Robots and autopilots might correct for human error, but they cannot compensate for their own designers. Perhaps a brighter technological future lies less in the latest gadgets than in learning to understand ourselves better, particularly our capacity to forget what we’ve already learned. The future of technology is nothing without a long view of the past, and a means to embody history’s mistakes and lessons, as we plow forever forward.
When Facebook was relatively new, that's exactly what its users (including me) did. In many ways, it was a better short-blogging platform than LiveJournal, and many of us left to enjoy the future of personal sharing.
Then, over time, Facebook added new kinds of content to the News Feed, along with one-click re-sharing. People eventually found it easier to click a button than to compose their own statements.
Next, they started curating people's News Feeds, selecting the content Facebook thought was important, instead of showing a raw stream of their friends' updates. Personal updates were squelched; linkbait and social gaming (Zynga, anyone?) were promoted.
To make matters worse, they stopped allowing subscribers to control their own News Feeds' content. Sure, you could hide updates from certain posters, or express a vague preference to "hide content like this" (whatever that means), but only one item at a time. There was never a way to filter the News Feed down to nothing but personal content. Through these maneuvers, Facebook made its business preference clear: keep people engaged through low-quality, quick-dopamine-hit content.
And those who continued to try to use Facebook as they'd done before noticed these changes. Their "Like" counts went down. People stopped commenting. In fact, it became clear that what they were posting -- even party invitations -- was perhaps never being seen by their friends at all.
At last week's Nvidia technology summit, IBM demonstrated how dramatically Watson has been upgraded by adopting Nvidia GPUs. So what happens when Watson is put into a robot?
The challenge IBM faces is keeping Watson fresh: the world's devices produce some 2.5 exabytes of data every day, a flood expected to balloon to 44 zettabytes by the year 2020. To keep up with the information overload, IBM announced late last year that it was adding NVIDIA's Tesla K80 processing engines to the mix. Those high-performance compute GPUs are playing a key role in Watson's cognitive computing development, especially in terms of natural language processing capabilities.
The result? Watson is more capable and human-like than ever before, especially when placed in a robot body. We got to see this first-hand at NVIDIA's GPU Technology Conference (GTC), when Rob High, an IBM fellow, vice president, and chief technology officer for Watson, introduced attendees to a robot powered by Watson. During the demonstration, we saw Watson in robot form respond to queries just as a human would, using not only speech but movement as well. When Watson's dancing skills were called into question, the robot responded by showing off its Gangnam Style moves.