Technology 2019: Another Year of Breakthroughs
Radhika Daga

The world is on the cusp of the Fourth Industrial Revolution. 2019 emphasised the opportunities and growth that flow from a sound digital economy. Technologies like Artificial Intelligence (AI), Blockchain, the Internet of Things (IoT), Natural Language Processing (NLP), 5G and Quantum Computing, among others, have seeped in to dominate every aspect of our lives. However, the success of these strategic and smart technologies is wholly centred on a treasure trove of data. Data has been said to be replacing gold by the end of this decade, though of a kind that is abundant in supply. Notably, the value of data lasts only as long as it can be used to make a meaningful deduction or change.1

As the world gets ready for the fifth generation of ICT, there has been a dramatic increase in the reach and exploitation of data for growth. Governments and businesses have sought to leverage new technologies for better efficiency and profits. China has taken an apparent lead in AI-related research, while data-centric business models have attracted a surge in investment and venture capital. Automation is the new normal across industries: 53% of global data and analytics decision-makers have implemented, are in the process of implementing, or are expanding or upgrading some form of AI.2 The availability of cloud-based platforms like Google Cloud Platform and Amazon Web Services has further boosted machine learning investments.3

AI was at the centre of most of the technological progress made over the year. Yet, as a matter of fact, a myriad of other technologies are also evolving, faster than ever. The upcoming models and concepts promise to disrupt every possible functioning structure, but for the better! The change is now inevitable and its acceptance only deferrable. Some glaring examples from the past year are worth recording in the archives and are briefly discussed below.

Natural Language Processing

In 2019, breakthroughs in Natural Language Processing (NLP) were quite a success for the future of data science and analytics. With the explosion in unstructured and non-numerical data, advances in NLP can make a large dataset intelligible to the computer. Through deep learning, the machine is trained on a huge repository of data to take on everything from rudimentary tasks like voice recognition, reading comprehension, summarisation and question answering4 to high-level cognitive ones that require the machine to think, judge and comprehend like the human mind.

Google, with its new transfer learning technique5 BERT, has rewritten the fate of NLP. It established a state-of-the-art model applicable to 11 individual NLP tasks. BERT is pre-trained on a large corpus of unlabelled text, including the entire Wikipedia (that’s 2,500 million words) and BookCorpus (800 million words). The ‘B’ signifies its bidirectional model: the ability to dig for the context of a word from both its left and its right.6 Building on Google’s transformer architecture from 2017, some groundbreaking developments took place in the last year, including Google’s Transformer-XL, OpenAI’s GPT-2, XLNet, ERNIE 2.0 and RoBERTa. Google’s Transformer-XL outperformed BERT in language modelling. This was followed by OpenAI’s GPT-2, a model that became famous for generating very realistic human-like text by itself.7
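The bidirectional idea can be illustrated with a toy sketch. This is not BERT or its API; it is a hypothetical co-occurrence model with a made-up three-sentence corpus, showing only the intuition: a masked word is scored against both its left and right neighbours, whereas a left-to-right model could consult only the left one.

```python
from collections import Counter

# Tiny made-up corpus for illustration only.
corpus = [
    "the bank raised interest rates",
    "the bank of the river flooded",
    "interest rates fell at the bank",
]

def predict_masked(tokens, mask_index, corpus):
    """Score candidate fillers by how often they appear with BOTH the
    left and the right neighbour of the mask anywhere in the corpus --
    a crude stand-in for bidirectional context."""
    left = tokens[mask_index - 1] if mask_index > 0 else None
    right = tokens[mask_index + 1] if mask_index < len(tokens) - 1 else None
    scores = Counter()
    for sent in corpus:
        words = sent.split()
        for i, w in enumerate(words):
            l = words[i - 1] if i > 0 else None
            r = words[i + 1] if i < len(words) - 1 else None
            # A unidirectional model would only check the left neighbour.
            scores[w] += (l == left) + (r == right)
    return scores.most_common(1)[0][0]

sentence = "the [MASK] raised interest rates".split()
print(predict_masked(sentence, 1, corpus))  # → bank
```

Here the right-hand context ("raised") is what disambiguates "bank" as a financial institution rather than a riverbank; a model reading only left-to-right would have seen just "the".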


Gene Editing

There has been staggering progress in the field of genome-editing technologies. CRISPR, as a gene-editing tool, has been identified to take on cancer and a host of other genetic diseases in humans, plants and even animals. The Chinese scientist He Jiankui, who created the first gene-edited babies using CRISPR, was condemned for the experiment and recently sentenced to three years in prison. However, even amid such calls to ban it, the creation of gene-edited babies has found resolve among scientists, particularly in the Chinese and Russian science communities. Last year saw astronauts aboard the International Space Station (ISS) apply this technology to better the adaptability of humans in micro-gravity. There have been attempts to improve the quality of the animals we eat by engineering their genes. And the technology has a shining future for agricultural yields in the increasingly difficult climatic conditions of the times ahead.

Further, to avoid the possibility of accidental errors in this widely popular gene-editing tool, an evolved method called prime editing was announced by scientists in 2019. This approach has the potential to correct up to 89% of known disease-causing genetic variations.8

Quantum Computing

With all the big players making advances in their own quests, Google’s announcement of quantum supremacy9 (the moment when a quantum computer surpasses the best supercomputer) stole the thunder last year. In what it called the “sputnik” moment for quantum computing, Google revealed that a computation which would take a supercomputer 10,000 years was done in 200 seconds by its machine. Its chief quantum computing rival IBM, which has also announced a 53-qubit model, has challenged this claim. AWS, Rigetti Computing, Intel and Microsoft are among the others to have added to the progress of this technology. However, the ability to crack encryption remains leaps ahead of where the technology stands today.10
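The scale behind the claim is easy to sanity-check with back-of-the-envelope arithmetic: simulating n qubits classically means storing 2^n complex amplitudes. A minimal sketch, assuming 16 bytes per complex amplitude (a standard double-precision complex number):

```python
# A register of n qubits is described by 2**n complex amplitudes, and a
# classical simulator must hold them all in memory.  At 16 bytes per
# amplitude, the requirement grows exponentially -- the reason a
# 53-qubit device strains even the best supercomputers.
def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

def human(nbytes):
    """Render a byte count in binary units."""
    for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB"):
        if nbytes < 1024:
            return f"{nbytes:.0f} {unit}"
        nbytes /= 1024
    return f"{nbytes:.0f} EiB"

for n in (20, 40, 53):
    print(n, "qubits:", human(state_vector_bytes(n)))
# 20 qubits: 16 MiB
# 40 qubits: 16 TiB
# 53 qubits: 128 PiB
```

At 53 qubits the full state vector alone would need 128 pebibytes, far beyond any single machine's RAM, which is why classical simulation of such circuits resorts to approximations and why the "supremacy" boundary sits around this qubit count.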


Technology in Warfare

The nature of war is acutely altered by technologies that have implications for the battlefield. Robotics, artificial intelligence, advanced materials, additive manufacturing (also known as 3D printing) and nanoenergetics, among others, are dramatically changing the character and nature of conflict in all domains. Innovations in military technologies have given rise to highly sophisticated weapons. Well into the era of ‘killer robots’, autonomous armed drones were effectively put to use in the last year. The success is not surprising when scientists claim that, technologically, autonomous weapons are easier to build than self-driving cars.11 Intelligence systems have almost reached a level where they can be deployed to fully manage battlefield operations. And exoskeletons are being introduced to augment the performance of soldiers on the battlefield.12

Apart from conventional war-fighting, there is a greater threat from adversaries on the virtual battlefield today. The ubiquity of cyber-attacks and information warfare has not only introduced a fifth domain of warfare; it can have deep implications for a country’s security. These attacks have flown in from unknown entities and had varying economic as well as security consequences. It has been suggested that a major cyber-attack could be paralleled to a nuclear attack, for example in the case of attacks on the critical infrastructure of an economy.13

Debates & Deliberations

The potential of emerging technologies has also been viewed with great pessimism. The way technologies are advancing, they can fill in for every shortcoming in human lives. But coincidentally, this overlaps with everything sci-fi movies had warned us against. There is growing skepticism about what is dubbed the “singularity” and the reality we are steadily approaching.

The last year saw an enhanced focus on ethics in Artificial Intelligence (AI). There is growing doubt, suspicion and awareness of AI's limitations.14 The amount of data demanded to realise the full benefits of the above-mentioned technologies is chilling, more so when this data breaches the privacy of users. The question confronting everybody alike is how to empower marginalised communities instead of exploiting them, and how to monetise data that is being extracted from people in all manner of sly ways. Another part of the threat emanates from ‘the great decoupling’, that is, the gap between increasing productivity and employment; technology is giving humans tough competition at work.15 Lastly, an extensive campaign to “stop killer robots” is gaining ground against Lethal Autonomous Weapon Systems in the military domain.

Social media has moved from being a mere channel of communication to an immensely powerful tool shaping and manipulating everything from politics and governance to the private lives of people. Advancements in Generative Adversarial Networks (GANs)16 have led to the proliferation of hyper-realistic deepfakes, which are now being used to target women and erode people’s belief in documentation and evidence.17 But thanks to some of these gaffes being uncovered in the past year, people are becoming more aware of the real picture not shown on their screens.
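The adversarial idea behind GANs can be made concrete with a deliberately crude caricature. This is not a real GAN (no neural networks, no gradients, and all numbers are invented for illustration): a "generator" value is repeatedly nudged in whichever direction better fools a "discriminator" that flags samples as fake based on their distance from the real data it has seen so far.

```python
import random

random.seed(0)

# Caricature of adversarial training: the generator is a single number g;
# the discriminator's model of "real" is a running average of real samples.
real_mean, g, step = 4.0, 0.0, 0.1
real_avg = 0.0

def fake_score(x, real_avg):
    # Higher score = looks more fake (farther from typical real data).
    return abs(x - real_avg)

for t in range(1, 201):
    real = random.gauss(real_mean, 0.5)       # discriminator sees a real sample
    real_avg += (real - real_avg) / t         # ...and updates its estimate
    # Generator keeps whichever small move better fools the discriminator.
    if fake_score(g + step, real_avg) < fake_score(g - step, real_avg):
        g += step
    else:
        g -= step

print(g)  # the generator's output has drifted toward the real data
```

The two-player dynamic is the essential point: as the discriminator's notion of "real" sharpens, the generator's output is pushed ever closer to it, which is why GAN-generated faces and voices become hard to distinguish from genuine ones.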

In the above light, it is expected that the current year will facilitate and energise the movement towards 'Explainable AI' to provide more transparency, accountability and reproducibility in AI models and techniques. Notwithstanding the downside of the disruption, these debates are the only way to remind us of the power we still exercise over these technologies. This pushback has compelled companies to pay more attention to protecting user privacy and combating deepfakes and disinformation. Though AI ethics guidelines remain vague and hard to implement, big tech companies are being motivated to do the needful.

Consequently, rising voices will have governments answering the call for a robust framework for their data policies. The AI industry is as full of opportunities as of risks. Contrary to popular belief, the new set of industries powered by AI can also lead to job creation, albeit of a different nature and capacity. According to a World Economic Forum (WEF) report, “The Future of Jobs 2018”, the growth of artificial intelligence could create 58 million net new jobs in the next few years.18

India for Technology

The march of technology is neither retractable nor demarcated. Talk of entering a Fourth Industrial Revolution is getting more real by the day. This clearly demands that our experts and policymakers work in tandem to propose thoughtful new legislation meant to rein in unintended consequences without dampening innovation.

Alongside, there is a growing skills gap, which will only widen with the coming of Industry 4.0. The country’s large population has nonetheless been quick to adapt to the change brought about by smartphones and the internet in a matter of a few years. With such a technological bent of mind, the youth population (more than 65% of India's 1.3 billion people are under the age of 35, and more than 50% under 25) is strongly inclined towards digital entrepreneurship. India also ranks as the third-largest start-up ecosystem in the world, with 26 start-up companies now valued at more than $1 billion each.19

The government has implemented several initiatives like Startup India, Make in India and a comprehensive Digital India Programme. The Ministry of Skill Development and Entrepreneurship was also established in 2014 to promote the cause. However, to remain relevant and atop the leaderboard, India needs to give free rein to the reskilling and retooling of existing structures that the Fourth Industrial Revolution necessitates. And while there will be strong demand for technical skills like programming and app development, skills that computers cannot easily master, such as creative thinking, problem-solving and negotiating, also need to be honed.

Apart from the civil sector, there has been an upswing in technology-based developments in national security institutions. Not long ago, a Defence Cyber Agency and a Defence Space Agency were set up in acknowledgement of the changing nature of warfare. There is a visible awareness of the role of AI and Big Data analytics in fighting future wars. The AI Task Force (2017) has made recommendations for the incorporation of data-driven technologies in the aviation, naval, land systems, cyber, nuclear and biological warfare arenas. Initial tenders, or RFIs (requests for information), were also floated on dual-use AI capabilities.20 The time is ripe for India to promote in-house qualitative data collection and analysis, even for impending goals like the ‘Blue Economy’. Revision of policies, as in the appointment of the Chief of Defence Staff, will also reinforce the adoption of futuristic technologies in the face of this revolution.

Notwithstanding the immense potential reflected by the country’s pace of growth, timing is critical to reap the benefits of the next phase of the technological revolution.

  1. "Data is neither the new oil nor the new gold… - Cisco Blogs." 9 Oct. 2018, Accessed 7 Jan. 2020.
  2. "Top Artificial Intelligence (AI) Predictions For 2020 From IDC ...." 22 Nov. 2019, Accessed 7 Jan. 2020.
  3. "A Technical Overview of Machine Learning and Deep Learning." 23 Dec. 2019, Accessed 7 Jan. 2020.
  4. "Better Language Models and Their Implications - OpenAI." 14 Feb. 2019, Accessed 7 Jan. 2020.
  5. Transfer learning is a research problem in machine learning that focuses on storing knowledge gained while solving one problem and applying it to a different but related problem.
  6. "Demystifying BERT: The Groundbreaking NLP Framework." 25 Sep. 2019, Accessed 8 Jan. 2020.
  7. "A Technical Overview of Machine Learning and Deep Learning." 23 Dec. 2019, Accessed 8 Jan. 2020.
  8. "CRISPR Cheat Sheet: The Most Important Gene Editing ...." Accessed 8 Jan. 2020.
  9. In 2012, John Preskill, a theoretical physicist at the California Institute of Technology, coined the phrase “quantum supremacy” to describe the moment when a quantum computer finally surpasses even the best supercomputer. The term caught on, but experts came to hold different ideas about what it means.
  10. "Google quantum computer leaves old-school ... - Cnet." 23 Oct. 2019, Accessed 8 Jan. 2020.
  11. "“Relatively Easy” to Deploy Killer Robots by 2021 - Futurism." 21 Jun. 2019, Accessed 2 Jan. 2020.
  12. "The Coolest (and Scariest) Military Tech of 2019 - Futurism." 27 Dec. 2019, Accessed 8 Jan. 2020.
  13. "Scientist: Major Cyberattack Could Be as Bad as Nuclear War." 20 Aug. 2019, Accessed 7 Jan. 2020.
  14. "AI, Analytics, Machine Learning, Data Science ... - KDnuggets." Accessed 8 Jan. 2020.
  15. "In 2020, let's stop AI ethics-washing and actually do something." 27 Dec. 2019, Accessed 8 Jan. 2020.
  16. A task in machine learning that involves automatically discovering and learning the regularities or patterns in input data such that the model can generate new examples that plausibly could have been drawn from the original dataset. Basically, the machine is able to generate realistic examples.
  17. Ibid.
  18. "Artificial Intelligence To Create 58 Million New Jobs By 2022 ...." 18 Sep. 2018, Accessed 8 Jan. 2020.
  19. "This is how India can become the next Silicon Valley | World ...." 2 Oct. 2019, Accessed 8 Jan. 2020.
  20. "Indian Navy's AI-fuelled Transformation To Catch Up With ...." 17 Apr. 2019, Accessed 8 Jan. 2020.

(The paper is the author’s individual scholastic articulation. The author certifies that the article/paper is original in content, unpublished and has not been submitted for publication/web upload elsewhere, and that the facts and figures quoted are duly referenced, as needed, and are believed to be correct. The paper does not necessarily represent the organisational stance.)
