Digital technology, media and intellectual property

A brilliant 2013 for science and technology. Ok, maybe not smartphones.

December 29th, 2013 |  Published in Big Data, Digital and Mobile Technology, IBM, Samsung, Telecom and broadband, The Smartphone Wars, Weekend tech diversions

Science and technology

 

If you’ve heard this story before, don’t stop me, because I’d like to hear it again. ~ Groucho Marx

29 December 2013 — During 2013 it seemed scientific discoveries were being made every day that changed the world we live in. From physics to medicine to biology, the technological and medical advancements that most people believed would never happen in their lifetime … well, happened. And continued to develop. These discoveries brought a myriad of new technologies and techniques that will only grow and improve with time, making the world a better place to live in.

Consider the list: the brain-mapping projects, the discovery of numerous Earthlike exoplanets, the development of viable lab-grown ears, teeth, livers and blood vessels, the atmospheric entry of the most destructive meteor since 1908, successful new treatments for diseases such as HIV, Usher syndrome and leukodystrophy, and a major expansion in the use and capabilities of technologies such as autonomous cars and 3D printing. Now that 3D printing has produced ears, skin grafts and even retina cells that could be built up and eventually used to replace defective eye tissue, researchers expect to be able to produce the first functioning organ next year. The organ, a liver, would not be for the purpose of human implant — that will take years of clinical trials and various medical and regulatory approvals. Instead, the liver would initially be for the development and testing of pharmaceuticals. This field of 3D printing, known as “organs on a chip”, will greatly increase the accuracy and speed of drug development and testing.

For me, the top discovery was immunotherapy against cancer, namely the recruitment of the immune system to eliminate tumors. Immunotherapy changes the mindset of researchers on how to treat cancer. I will write more about this next year.

Well, then there was all that “uncreative destruction” over there in the smartphone ecosystem. Not that the most popular surveillance device we stick in our pocket wasn’t being talked about. Edward Snowden … now throwing us into what appears to me to be a growing wave of surveillance fatigue … at least revealed to many of us (those who have not been following the excellent Wired magazine series these last 4-5 years) the uncomfortable truths about how today’s world works: technical infrastructure, geopolitical power, rampant consumerism, ubiquitous surveillance, ever-increasing internet control. They should make a movie. Oh, sorry. It will be a movie in 2014.

Ah, yes. All of those forces interconnected in ways most of us would rather not acknowledge or think about … much of it via our smartphones. We focused (mainly) on just one element in this long chain: state spying. The ambivalent future of “digital capitalism” (first coined by Michael Betancourt but developed by Evgeny Morozov) has not been much discussed, even as power relocates from Washington and Brussels to Silicon Valley. As Morozov has written, governments, always seemingly strapped for cash and low on infrastructural imagination, surrendered their communications networks to technology companies.

It seemed, though, that innovation was replaced by financial engineering, mergers and acquisitions, and the evasion of regulation. American tech firms flocked to Ireland to avoid regulation, or simply concocted whatever structures they needed to avoid regulation … anywhere.

Not a single breakthrough product was unveiled. I do not count Google Glass; that is smartphone technology still in search of an application. It just seems as though the industry willingly retreated from a culture of risk and exploration towards one of safety in established profit flows. We discarded a century of can-do ambition built on rapid advances in technology and replaced it with a cautiousness far too satisfied with incremental improvements.

Ok, perhaps I am too cruel. It is not the device anymore. It is the application. That is the innovation. For the past two years I have kept to a reduced conference circuit: FutureMed, the IBM cognitive research/Watson events, LIFT, and the Mobile World Congress. This year I did add DLD Tel Aviv Digital Innovation, where I had a chance to meet Yossi Matias, the managing director of Google’s R&D Center in Israel and Senior Director of Google’s Search organization. Clearly one of the smartest guys around. DLD itself will be added this year.

If I learned nothing else from these conferences, it was that every smartphone addict and Google-whacker is continually evolving their thinking by making the most of today’s digital tools, and our thinking will continue to evolve as new tools enter our lives. That is the new, marvelous cognitive landscape.

And I certainly have jumped on board with new applications never seen before. Eric De Grasse, my Chief Technology Officer, has constructed for me a “cluster” program that “reads” all of my incoming email and any articles I download, categorizes them, and sends them to the appropriate reading list or folder on my master Gmail and iCloud accounts, to be accessed on whatever device I am using. Magna productivity.
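
For the curious, here is a minimal sketch of the idea behind such a program, using scikit-learn’s TF-IDF vectorizer and k-means clustering. To be clear: the sample documents, folder names and parameters below are all hypothetical, and Eric’s actual implementation may differ entirely.

    # Minimal sketch: cluster incoming documents into "folders".
    # All sample documents and parameters here are hypothetical.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    # Stand-ins for incoming email bodies and downloaded articles.
    documents = [
        "New immunotherapy trial results for leukemia patients",
        "Qualcomm unveils a neuromorphic processor architecture",
        "Samsung smartphone shipments rise again this quarter",
        "3D-printed liver tissue used for drug development and testing",
    ]

    # Turn each document into a weighted bag-of-words vector.
    vectors = TfidfVectorizer(stop_words="english").fit_transform(documents)

    # Group similar documents; each cluster maps to a reading list or folder.
    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(vectors)
    for doc, label in zip(documents, kmeans.labels_):
        print(f"folder-{label}: {doc}")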

 

I think the true innovation, the brilliant innovation, was in other areas, such as the neuromorphic processors unveiled this year that will narrow the gulf between artificial and natural computation: between circuits that crunch through logical operations at blistering speed and a mechanism honed by evolution to process and act on sensory input from the real world. Caltech’s Carver Mead pioneered “brain inspired” computing in the 1980s, based on theoretical math and logic. The University of Zurich/ETH, DARPA, Intel, IBM, Qualcomm, and others, as well as the “deep learning” initiatives of Google and IBM, were the true tech advances this year.
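
To give a flavor of what “brain inspired” means in practice, here is a toy leaky integrate-and-fire neuron, the basic building block such neuromorphic chips implement in analog or digital silicon. The parameters are purely illustrative, not drawn from any of the chips mentioned above.

    # Toy leaky integrate-and-fire neuron (illustrative parameters only).
    def simulate(input_current, threshold=1.0, leak=0.9, reset=0.0):
        """Integrate input over time, leak charge, spike at threshold."""
        potential, spikes = 0.0, []
        for t, current in enumerate(input_current):
            potential = potential * leak + current  # integrate with leak
            if potential >= threshold:              # fire, then reset
                spikes.append(t)
                potential = reset
        return spikes

    # A steady drip of input produces a regular spike train:
    print(simulate([0.3] * 20))  # [3, 7, 11, 15, 19]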

Oh, and Artificial General Intelligence (AGI), a discipline that seeks to render the human brain in a computing device. One of the first things you learn when you study neuroscience … in fact, it came up in my very first course … is the list of things that make humans a unique species. The standouts, of course, are our mind and our brain. The human mind can carry out cognitive tasks that other animals cannot, like using language, envisioning the distant future and inferring what other people are thinking.

Many have been prophesying that we will soon achieve the ultimate breakthrough in AGI … human thinking. So, so difficult. And I think the ball is clearly bouncing between two courts: Google and IBM. If you want to wrap your mind around the prevailing thinking and the supreme difficulties, there is a plethora of sources. But having spent the last two years immersed in the subject, I highly recommend you start with just two short pieces by members of opposing camps: David Deutsch’s “Creative Blocks: The Very Laws of Physics Imply That Artificial Intelligence Must Be Possible. What’s Holding Us Up?” and Ben Goertzel’s reply, “The Real Reasons We Don’t Have AGI Yet,” published on Ray Kurzweil’s site.

The difficulties? I tend to agree more with Deutsch: “It [is] a failure to recognize that what distinguishes human brains from all other physical systems is qualitatively different from all other functionalities, and cannot be specified in the way that all other attributes of computer programs can be. It cannot be programmed by any of the techniques that suffice for writing any other type of program. Nor can it be achieved merely by improving their performance at tasks that they currently do perform, no matter by how much.” 

But Goertzel is correct, too, when he says that the biggest issue is the weakness of current computer hardware, which is rapidly being remedied via exponential technological growth.

These discussions are going to be quite important in 2014. As search systems do more thinking for the human user, disagreements that appear to be theoretical will have a significant impact on what information is displayed for a user. Do users know that search results are shaped by algorithms that “think” they are smarter than humans? Good question.
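
A toy example makes the point. In the sketch below (every score and weight is invented for illustration), a personalization weight the user never sees decides which result is displayed first:

    # Hypothetical results with a "base" relevance score and a score
    # from a personalization model; all numbers are invented.
    results = [
        {"title": "AGI skeptic essay",        "base": 0.7, "personal": 0.1},
        {"title": "AGI enthusiast blog post", "base": 0.5, "personal": 0.9},
        {"title": "Neutral AGI survey",       "base": 0.6, "personal": 0.4},
    ]

    def score(result, personalization_weight=0.6):
        # The weight encodes the algorithm's opinion of what the user
        # "really" wants -- the user never sees this number.
        return ((1 - personalization_weight) * result["base"]
                + personalization_weight * result["personal"])

    for r in sorted(results, key=score, reverse=True):
        print(f"{score(r):.2f}  {r['title']}")  # the enthusiast post now ranks first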

Postscript:

The “selfie”, Oxford Dictionaries’ official word of the year, is simply perfect for our time. Its immediacy is the key. Online, we can alter or completely reinvent every facet of our selves – from our personality and profession to our gender, ethnicity and name. In this fakery free-for-all, authentic identity is increasingly derived not from who we are or what we do, but from what we are doing right now. If we have no thought to Tweet or photo to post, we basically cease to exist. So while the selfie may seem narcissistic, it is not motivated by narcissism so much as by our digital existential angst. It is what technology has done to us.

And angst it is. Computing power is starting to solve everyday problems. Language recognition has now eliminated the wall that prevented robots from working alongside humans. One of the most sobering pieces of evidence this year regarding the impact of technology on the job market was the London School of Economics study … by the same folks that coined the term “job polarization” a decade ago … which found that employment in the UK and the US had been rising for people at the top and the bottom of the income scale. There was more demand for lawyers and high-level technos … and burger flippers. It was the middle-skill jobs that were disappearing. Strong gains at the top, some gains at the bottom, stagnation in the middle.

And that’s the angst: will we be equipped to deal with the possibility that in future, there will be people who – despite being willing and fit to work – have no economic value as employees?

So, you want fries with that?

 

I’ll leave you with a video by David McCandless on the beauty of data visualization. David is one of my favorites: a British data-journalist and information designer based in London, and the founder of the visual blog Information is Beautiful. David turns complex data sets, like worldwide military spending, media buzz, and Facebook status updates, into beautiful, simple diagrams that tease out unseen patterns and connections. Good design, he suggests, is the best way to navigate information glut — and it may just change the way we see the world. Happy New Year.
