Cambridge News: Looking into 2013

It's that time of year again when we take a moment to reflect on all that has happened in the year just gone and, perhaps more importantly, dream about what is possible as we enter a fresh new year. It's unsurprising, then, that everyone's guilty pleasure is reading the flood of articles predicting what we will see happen over the coming twelve months. Well, dear reader, I plan to go one step further! As those who work in technology will know, we spend a lot of our time future-gazing, so making predictions that cover only the next year seems pretty tame. Here, I plan to talk to you about my predictions for what the next ten years will have in store for us.

Of course, I am warned by the scientific greats about the perils of committing oneself to predictions. The Danish physicist Niels Bohr famously said, "prediction is very difficult, especially about the future". Others had warned him too: the quote is also attributed to Mark Twain, and the sentiment has been repeated since by a wealth of public figures, from George Bernard Shaw to Winston Churchill. Yet flying in the face of this advice, here are some crazy thoughts as to what the world will look like by 2025…

It's clear to almost everyone that connectivity will continue apace, but most interesting are the exciting developments this will enable. The Internet of Things, already in its infancy, will become a reality: devices will be manufactured with Internet connections as standard, and will be built with an increasing number of embedded sensors. They'll become data-gathering machines, amassing vast amounts of information about their environments: where they are, where they move, what interactions you have with them, and so on.

And yet, despite data volumes increasing exponentially, it is going to take a system that actually understands this data, and can synthesise inputs from multiple sources (think audio and video recognition), for us to see true value in the age of IoT. Imagine a world where devices are not only fitted with cameras but have the ability to hear audio too. This means devices will be able to actually see and hear the world around them – but this by itself is useless: unless these devices recognise what these inputs mean, and what actions should follow, so-called "smart devices" remain fundamentally unintelligent. Think about autonomous cars, which need to be able to look at and understand all different kinds of road signs without human intervention. The imperative is clear, and I predict the development of very advanced machine intelligence algorithms, which will bestow intelligence and understanding upon devices in all situations.

In fact, the growing application of machine learning to smart devices is going to revolutionise the way we interact with information. We'll move from our current text-centric way of interacting with information to one that closely mirrors human interaction: talking, asking, looking; not typing words on tiny smartphone screens as we walk along the street! In ten years' time, hardware will have improved to the point where wearables will be sleek and unobtrusive – able to present us with visual and audio information when we need it. Traditional keyword search engines will be obsolete, and we will think more in terms of 'discovery' – being presented with information that is relevant to what we are doing, in the moment – closing the gap between our physical and virtual worlds.

Cyber security is going to move from an enterprise issue to one that is at the forefront of everyone's mind. Our current way of dealing with cyber threats – trying to erect a digital "wall" between ourselves and the bad guys, or issuing software updates that patch holes in code once threats are discovered – will be shown time and time again to be painfully inadequate. Necessity will mean we move to methods that are predictive and proactive, not merely reactive once a problem has happened. To take a quick biology lesson: our bodies have developed an incredible immune system which learns what is good and bad, and deals with threats accordingly, ensuring we don't get too sick or die. Cyber security will follow the same pattern: systems that learn what 'normal' looks like inside a network and act on deviations as they emerge. Rules-based approaches won't work, and I believe these machine-learning approaches will become the new standard instead.

Speaking of health, the medical industry will see huge changes – for both better and worse. Antibiotic resistance will continue to grow and pose serious problems for treatment plans. I don't think we will be entering some pre-Alexander Fleming dystopia in the next ten years, but it will certainly mean extra drug development in order to fight infectious diseases. In the field of genetics, we can see parallels to the as-yet-untapped potential of IoT: increased genetic sequencing is generating massive volumes of data, but true insight and value will only be gained when we can understand what all this data means. Again, as with IoT, a more holistic approach is needed: combining sequence reads with past prescription information, family histories and patient epigenetic data (shaped, for example, by age and environment). This more human-like reading of the data will build a much more valid picture of a patient, enabling highly advanced personalised medicine for treating disease.

I know these are some huge leaps forward in just ten years, but I truly believe we are entering a period of hyper-change, with technology fundamentally altering the world around us very quickly. Then again, there's one sure thing you can predict about predictions: they're always wrong.

A version of this article appeared in Cambridge News.
