AI / Machine Learning
It's good to see that open source Siri / Alexa alternatives with intelligent backends are being developed.
Apple is rumoured to be working on its own AI chip focused on improving battery performance for AI-type applications, which makes sense in light of Apple's product lineup.
Not new, but the first time I saw it: a projector with a touch interface. Pretty interesting how it brings interfaces to surfaces. Just imagine having this in your phone. Runs Android.
The three-year EU-funded project, dubbed the Decentralised Citizen Owned Data Ecosystem (DECODE), will see a total of four pilot trials launch in Barcelona and Amsterdam at the end of 2017. In each city, 1000 people will be given an app through which they can share data about themselves to help companies or government groups create products or services to improve the city.
Each citizen will be able to decide exactly how much of their data is uploaded to the platform and how it should be used. For example, a person may decide that location-tracking data about parks they visit can be used by the city council but not private companies.
I think this is one of the more interesting approaches to using blockchain technology. The idea of sharing data publicly while keeping control over who has access to it is a great goal, and one that is lacking in today's internet.
In 1634, the rage among the Dutch to possess [tulips] was so great that the ordinary industry of the country was neglected, and the population, even to its lowest dregs, embarked in the tulip trade. As the mania increased, prices augmented, until, in the year 1635, many persons were known to invest a fortune of 100,000 florins in the purchase of forty roots.
At last, however, the more prudent began to see that this folly could not last for ever. Rich people no longer bought the flowers to keep them in their gardens, but to sell them again at cent per cent profit. It was seen that somebody must lose fearfully in the end.
However, as Thompson wrote in the paper, “appearances are sometimes quite deceiving.”
As Thompson explains, tulips were in fact becoming more popular, particularly in Germany, and, as the first phase of the Thirty Years' War wound down, it looked like Germany would be victorious, which would mean a better market for tulips. In early October 1636, though, Germany suffered an unexpected defeat, and the tulip price crashed, not because it was irrationally high, but because of an external shock.
The internet is broken. It has been for a while. Even the fathers of the internet, Sir Tim Berners-Lee and Vint Cerf, say that it's broken. We realize you are probably reading this on the internet, and it seems to be working just fine.
I had just published my own post, Native vs Web, when this appeared on Hacker News - even though it was published a week ago. It gives good insight from a developer's perspective.
Walt Mossberg's final column. He's been part of my (computing) life for as far back as I can remember and was (is) an icon of the industry.
In his column he writes:
I expect that one end result of all this work will be that the technology, the computer inside all these things, will fade into the background. In some cases, it may entirely disappear, waiting to be activated by a voice command, a person entering the room, a change in blood chemistry, a shift in temperature, a motion. Maybe even just a thought.
This is ambient computing, the transformation of the environment all around us with intelligence and capabilities that don’t seem to be there at all.
Computers have gotten vastly easier to use, but they still demand attention and care, from charging batteries to knowing which apps to use and when to use them.
But it’s also been about objects and processes. Soon, after a brief slowdown, the roller coaster will be accelerating faster than ever, only this time it’ll be about actual experiences, with much less emphasis on the way those experiences get made.
I totally agree with this vision. For all the talk about wearables, voice assistants, AI, machine learning, virtual reality, and augmented / mixed reality, ultimately we have made great breakthroughs in putting computing into most people's hands. The next step is to make it disappear.