The decade that disrupted us (via Qpute.com)

As the 2010s draw to a close and we look forward to the new 2020s, Bronwyn Williams takes a quick trip in a hypothetical time machine back to the start of the last decade of disruption and the technological milestones that defined it.



Although, thanks to Brexit and that wall, the utopian globalist agenda may be showing strain at the moment, we can find comfort in the emergence of our new favourite international language – the emoji.

In 2010, the first emoji character library was accepted into Unicode, thereby recognising the little symbols we are all so familiar with today as an official universal “internet language”.

Our contemporary hieroglyphics unite generations and nations in a common visual tongue.

They could also, however, mark the beginning of the end of the age of literacy as we know it today. After all, who needs to text letters when a picture says a thousand words?
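To see what acceptance into Unicode means in practice, here is a minimal Python sketch (using only the standard library's unicodedata module) showing that each emoji is just an ordinary Unicode character with its own code point and official name, exactly like a letter of the alphabet:

```python
import unicodedata

# Every emoji accepted into Unicode gets a code point and a formal name,
# which is what lets any device or platform render and exchange them.
for char in "😀🚗🍔":
    print(f"U+{ord(char):X}  {unicodedata.name(char)}")
```

Running this prints the code point and official Unicode name for each symbol, for example `U+1F600  GRINNING FACE` for the first one.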





In 2011, the first Uber drivers took to the streets, accelerating both the on-demand and the sharing economy. Commuters could catch a ride at the touch of a button on their smartphones – and no longer needed to own a vehicle to enjoy the convenience of private car travel.

Today, the e-economy has evened out to allow us to share anything from housing to handbags, thanks to companies such as Airbnb and Rent. The takeaway is that we no longer need to own an object to enjoy it. Also worth considering is how Uber and its fellow gig economy firms are reshaping work and labour law.

Society is still grappling with how to deal with gig workers who report to an app, rather than a human boss, and are not covered by traditional labour laws.





The world is in the middle of a “sex recession”. Teenagers and young adults are more likely to be virgins than their parents and grandparents were at the same age. This phenomenon may or may not be linked to the way interactions are increasingly taking place online, rather than in person.

Tinder, the infamous dating app which launched in 2012, is just one example of how people are turning to technology to help connect with each other – with varying degrees of success.

Clearly, though, we are missing something from our fellow humans in our digitally connected world. More and more people are turning to artificially intelligent chatbots for companionship, such as Microsoft’s Xiaoice, which has more than 100 million “friends”.



The move towards veganism and vegetarianism is a growing global trend. In the US, for example, one in four 25- to 34-year-olds do not eat meat.

Then in 2013, science gave us a way to have our cow and eat it too – in the form of synthetic, cruelty-free lab-grown burger patties that look and taste just like the real deal.

Looking ahead, as startups such as Future Meat Technologies make high-tech foods more accessible, acceptable and affordable, it is likely future generations will view killing animals for food as a barbaric, embarrassing relic of human history.

Other faux food firms such as Perfect Day and Clara Foods are replacing milk and eggs with artificial imitations indistinguishable from the “real” product.



In 2014 China started piloting its ambitious, ubiquitous “social credit score” system to track and rank citizens based on online and offline behaviour.

Built around a national surveillance network, the system rewards “good” citizens and punishes offenders. Individuals with low scores are denied access to services and freedoms such as using public transport or attending top schools.

Similar human quantification systems can be found in capitalist countries, where consumers are tracked, rated and rewarded by the companies that serve and sell to them. South Africans are familiar with the behavioural rewards (and punishment) systems employed by medical and vehicle insurers. Rule by behavioural economics, or “nudge”, is set to grow.



AlphaGo

In October 2015, Alphabet’s artificially intelligent computer programme AlphaGo beat a professional human Go player for the first time.

This impressive feat of computing prowess reignited the global conversation around the future of artificial intelligence (AI) and the possibility of the so-called Singularity – the point at which AI surpasses the entirety of human intelligence. It also reawakened concerns about artificially intelligent machines and algorithms replacing human jobs, perhaps leading to a global post-work economy.

Since then, AI and machine learning have progressed to the point that the world’s top Go player, Lee Se-dol, has retired in defeat, stating that AI “cannot be defeated”.



In 2016, the website BuzzFeed coined the term ‘fake news’ in response to a spate of plainly inaccurate, yet intriguingly titled, web articles originating from Macedonia.

Since then, the lies have continued to spread around the world, influencing elections from the US to the UK and South Africa, while the truth limps along behind trying to clean up the fallout.

Fake news, spread via viral clickbait articles shared on social media, has become a global phenomenon with wide-reaching consequences.

Its impact can be felt everywhere – from the growing anti-vaccination movement, blamed for the re-emergence of once-eradicated measles outbreaks, to the spread of dangerous populist political ideas and the rise of extremist political parties globally.



If fake news was problematic, it was only the start. In 2017, a Reddit user coined the term “deepfake” to describe a series of videos he had edited using a machine-learning algorithm, transposing famous people’s faces onto porn footage to create convincingly realistic fake movies. In the age of the deepfake we can no longer trust our eyes or ears, as sitting presidents and corporate leaders have discovered to their detriment. Unscrupulous agents can now literally put fake words into real people’s mouths, and real people into really compromising situations.

Seeing is no longer believing.



Last year, the first genetically engineered human babies, twin girls, were born in China, ushering in the age of intelligent designer babies.

The girls’ genomes had been edited using CRISPR-Cas9 technology while they were still embryos. As the technology progresses, and as more governments allow genetic engineering of humans and human embryos, we are sitting on the precipice between natural selection (evolution) and intelligent design.

The ethics of what should be allowed (for example, the eradication of genetic illnesses) and what should be restricted (such as selecting and editing human embryos for good looks or superior intelligence) will be some of the most important questions the human race needs to answer in the years ahead.



If you thought the last decade was disruptive, just wait until you see what comes next.

This year, Google announced that it had achieved quantum supremacy – in other words, that its quantum computer had performed a calculation that would take even the most powerful classical computers an impractically long time.

Should the technology continue to progress from this early sign of success, quantum computing could dramatically increase the processing power and speed of computers as we know them today.

Then add 5G-speed internet – set to roll out in China early next year – to the mix, and we can look forward to another decade of superspeed disruptions.

Bronwyn Williams is trend translator at FluxTrends.com

This is a syndicated post.