In his book "Digital Transformation," Thomas Siebel argues that the technologies propelling digital transformation are game-changing, yet we are still in the very early stages of this new era. He suggests that organizations must learn these new technologies to make well-informed decisions. According to Siebel, four key technologies both drive and enable digital transformation: cloud computing, big data, AI, and IoT. We will briefly touch on each of those areas in this section.
The widespread adoption of cloud computing has democratized data collection and increased the capacity of enterprises, allowing companies of any size to forgo costly IT infrastructure and cumbersome maintenance regimens.
By moving most services to the cloud, businesses can stay nimble and better manage scale than ever before. The lowering prices of cloud computing have led to the rapid growth of “as-a-service” systems that have given smaller companies access to tools that were previously far too costly. Amazon Web Services (AWS), Google, Microsoft, and Alibaba are the biggest cloud titans battling it out for supremacy over the market.
With Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) systems all gaining popularity, cloud computing will be one of the defining elements of the next decade. A recent Forrester report predicted that the public cloud market would reach $411 billion by 2022, and that the four leading cloud vendors would generate 75% of the entire $75.4 billion global public cloud infrastructure market.
Big data is considered the crucial second enabler of digital transformation. Big data has one or more of the following characteristics: high volume, high velocity, or great variety. Artificial intelligence, mobile, social, and the Internet of Things (IoT) are driving data complexity through new forms and sources of data. For example, big data comes from sensors, devices, video/audio, networks, log files, transactional applications, web, and social media – much of it generated in real time at enormous scale.
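As a rough illustration of the "variety" characteristic described above, the sketch below mixes records from several of the sources mentioned (sensors, log files, social media) into one stream and tallies them by source. The record shapes and field names are invented for the example.

```python
from collections import Counter

# Illustrative only: a tiny mixed-source stream standing in for the
# heterogeneous data mentioned above (sensors, logs, social media).
stream = [
    {"source": "sensor", "temp_c": 21.4},
    {"source": "log", "line": "GET /index.html 200"},
    {"source": "social", "text": "Loving the new dashboard!"},
    {"source": "sensor", "temp_c": 22.1},
]

def count_by_source(records):
    """Tally incoming records per source type."""
    return Counter(r["source"] for r in records)

counts = count_by_source(stream)
print(counts["sensor"])  # 2
```

At real big-data scale, this kind of per-source aggregation would run continuously over a stream rather than an in-memory list, but the grouping logic is the same.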
In his book "Digital Transformation," Thomas Siebel writes: "Enabled by Cloud Computing, a generation of AI is being applied in an increasing number of cases with stunning results. And we see IoT everywhere – connecting devices in value chains across industry and infrastructure, generating terabytes of data every day. Yet few organizations have the know-how to manage, let alone extract value from, so much data. Big Data now pervade every aspect of business, leisure and society. Companies now face their own Oxygen Revolution: The Big Data Revolution. Like oxygen, big data are an important resource with the power to both suffocate and drive revolution. During the Great Oxidation Event, species began to create new channels of information flow, use resources more efficiently, and mediate connections previously unheard of, transforming oxygen from a lethal molecule into the source of life. Big Data and AI, along with cloud computing and IoT, promise to transform the technoscape to a similar degree" (Thomas Siebel, Digital Transformation, 2019, p. 9).
AI works by combining large amounts of data with fast, iterative processing and intelligent algorithms, allowing the software to learn automatically from patterns or features in the data. AI is a broad field of study that includes many theories, methods and technologies, as well as the following major subfields:
- Machine learning automates analytical model building. It uses methods from neural networks, statistics, operations research and physics to find hidden insights in data without explicitly being programmed for where to look or what to conclude.
- A neural network is a type of machine learning model made up of interconnected units (like neurons) that process information by responding to external inputs and relaying information between units. The process requires multiple passes over the data to find connections and derive meaning from undefined data.
- Deep learning uses huge neural networks with many layers of processing units, taking advantage of advances in computing power and improved training techniques to learn complex patterns in large amounts of data. Common applications include image and speech recognition.
- Cognitive computing is a subfield of AI that strives for a natural, human-like interaction with machines. Using AI and cognitive computing, the ultimate goal is for a machine to simulate human processes through the ability to interpret images and speech – and then speak coherently in response.
- Computer vision relies on pattern recognition and deep learning to recognize what’s in a picture or video. When machines can process, analyze and understand images, they can capture images or videos in real time and interpret their surroundings.
- Natural language processing (NLP) is the ability of computers to analyze, understand and generate human language, including speech. The next stage of NLP is natural language interaction, which allows humans to communicate with computers using normal, everyday language to perform tasks.
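To make the machine-learning idea above concrete – software that learns from patterns in data "without explicitly being programmed for where to look or what to conclude" – here is a minimal sketch: a single artificial neuron (a perceptron) that learns the logical AND function from examples rather than from a hand-coded rule. This is the author's concept illustrated with a textbook algorithm, not code from the book.

```python
# A minimal machine-learning sketch: a single neuron (perceptron)
# learns the AND function from labeled examples.
def train_perceptron(samples, epochs=20, lr=0.1):
    """Fit weights and a bias with the classic perceptron learning rule."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Step activation: fire only if the weighted sum crosses zero.
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # Nudge each weight toward the correct answer.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Training data: the AND truth table. The rule is never written down;
# the neuron infers it from the examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

Deep learning, mentioned above, stacks many layers of such units and trains them on far larger datasets, but the core idea – adjusting weights from examples – is the same.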
IoT (Internet of Things)
This is the fourth technology driving digital transformation. The IoT refers to the billions of physical devices around the world that are now connected to the internet, collecting and sharing data. Pretty much any physical object can be transformed into an IoT device if it can be connected to the internet and controlled that way. Statista projects: "We expect to see 75 billion internet-connected devices by 2025." These things are not general-purpose devices, such as smartphones and computers, but dedicated-function objects such as vending machines, jet engines, connected cars, and a myriad of other examples.
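As a hypothetical sketch of what one of those dedicated-function objects does, the code below shows a vending machine assembling a telemetry reading as JSON – the kind of payload such a device would periodically send to a cloud endpoint. The device ID, field names, and stock data are all invented for illustration.

```python
import json
import time

def build_telemetry(device_id, stock_levels):
    """Package one telemetry message describing the machine's state."""
    return json.dumps({
        "device_id": device_id,
        "timestamp": int(time.time()),
        "stock": stock_levels,
        # Flag items that need restocking so the backend can dispatch service.
        "needs_restock": [item for item, qty in stock_levels.items() if qty == 0],
    })

# Example reading from a hypothetical machine "vend-042".
msg = build_telemetry("vend-042", {"cola": 3, "water": 0, "chips": 7})
print(json.loads(msg)["needs_restock"])  # ['water']
```

Multiply this by billions of devices, each reporting continuously, and the link back to big data and cloud computing becomes clear: IoT is where much of that real-time data originates.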