Advancements in the quality, mobility, and adoption of screen-display devices allow video content to be streamed through many new interfaces in many places, from personal tablets to smart TVs. As these devices become cheaper and more video content becomes available, people stream content more frequently, wherever they happen to be. Unfortunately, many local news stories are published only in text-based (i.e., static) form, making them accessible in web browsers but unavailable on video platforms. In line with Bloom Labs' dedication to advancing local news accessibility, our team initiated this R&D effort to learn how text-based stories can be made accessible as video.
For this R&D effort, we collaborated with Rideplay TV, which provides tablets that deliver content to commuters in rideshare vehicles. Our goal was to learn how recent news stories geotagged near the vehicle's destination could be programmatically converted to video and streamed on the tablet for the commuter to view.
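One way to select "stories near the vehicle's destination" is a simple great-circle distance filter over geotagged stories. The sketch below is illustrative only: the `lat`/`lon` story schema, the `stories_near` helper, and the 25 km default radius are assumptions, not part of the actual prototype.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def stories_near(stories, dest_lat, dest_lon, radius_km=25.0):
    """Keep only stories geotagged within radius_km of the destination.

    Each story is assumed (for illustration) to be a dict with
    'lat' and 'lon' keys.
    """
    return [
        s for s in stories
        if haversine_km(s["lat"], s["lon"], dest_lat, dest_lon) <= radius_km
    ]
```

A production version would also honor the "recently" constraint, e.g. by dropping stories older than some publication-time cutoff before applying the distance filter.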
Bloom Labs developed a prototype that took a location as input, curated a text-based feed of local news stories, and converted it into a file formatted for HTTP Live Streaming (HLS). The file was then made available to the tablet's application and refreshed automatically twice per day, for morning and evening news. For the user experience, one story was shown at a time, with its title, image, location, and publisher name. A QR code was generated for each story's URL, giving the commuter a contactless way to open and read the full story on their own device.
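To make the HLS step concrete, the fragment below renders a minimal video-on-demand media playlist (.m3u8) of the kind an HLS client consumes. It is a sketch under assumptions, not the prototype's actual conversion pipeline: the `build_hls_playlist` helper and the one-segment-per-story layout are hypothetical.

```python
import math

def build_hls_playlist(segments):
    """Render a minimal HLS media playlist (.m3u8) as text.

    `segments` is a list of (duration_seconds, uri) tuples -- here
    imagined as one video segment per rendered news story.
    """
    # EXT-X-TARGETDURATION must be an integer no smaller than any
    # segment duration, so round each duration up.
    target = max(math.ceil(d) for d, _ in segments)
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for duration, uri in segments:
        lines.append(f"#EXTINF:{duration:.3f},")
        lines.append(uri)
    lines.append("#EXT-X-ENDLIST")  # VOD playlist: no further segments follow
    return "\n".join(lines) + "\n"
```

For the twice-daily refresh described above, a scheduler would simply regenerate this file from the morning or evening feed and overwrite the copy the tablet application fetches.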
The prototype has been tested internally but has yet to be fully deployed and tested with commuters. Further development and testing are planned for a later date.