An upgrade to the Lens feature will allow you to use a camera to learn more about what you’re eating.
If you’ve followed the tech-insider comedy Silicon Valley this season, you’ll know that the hope/bane of ragtag incubator and would-be startup creator Erlich Bachman's career has been a “Shazam for food” app developed by his unwilling protégé Jian Yang. Shazam, of course, is the app that listens to music or other ambient sound and can identify which song is playing, making it easy to discover new bands or artists. Marry that concept with pointing a camera at food, and you’ve just created the fictional SeeFood app from the sitcom… or Pinterest’s latest real-life upgrade to its Lens feature.
In what seems like either perfect or imperfect timing, the very same week SeeFood was demoed on the TV show, Pinterest debuted a few food-centric tweaks to its software, including Lens, a visual discovery tool that essentially lets you search pins based on a photo or video. The tool has already been live for items like clothing and housewares, but it now supports food and actual dishes, allowing users to find recipes, facts, or other content related to whatever they’re looking at. It’s one of many useful ways technology can make everyone’s food and beverage experiences even better (including not getting shorted on beer).
According to The Verge, Pinterest isn’t the only web company looking to make visual search as ubiquitous as Googling. In fact, Google itself recently rolled out its own lens tool, and Facebook is also developing the capability. Heck, even the HBO series had its in-house tech consultant build a version of the SeeFood app. But, as it did on the show, the app came with the limitation of only recognizing hot dogs and “not a hot dog.” If you know anything about the tone of Silicon Valley, you can guess where they went with that…