Big Data Changes Over the Next Decade

Where will society be in 10 to 15 years as a result of big data analytics? By 2025, the Internet of Things is estimated to exceed 100 billion connected devices, driving a trillion-sensor economy. This growth in big data will likely produce a jump from about three billion to potentially more than eight billion connected lives. The sensor-based economy will generate information that allows patterns to be detected that were never before discoverable. The integration of 5G and the Internet of Things will require new networks that can support trillions of devices with high reliability, quality, and data transmission rates. With this integration, we can expect deeper coverage, lower costs, ultra-low energy usage, robust security, and high availability.

Big data analytic tools will continue to evolve, allowing those without programming skills to perform sophisticated analysis with ease across both structured and unstructured data. In this sense, the role of the analyst will change as big data analytics becomes a routine part of everyday business. And just as children once picked up calculators with ease to perform simple calculations, visual tools will one day let even children leverage big data sets to make sense of the world.

Overall, there will be more data at our fingertips than was ever before possible. The questions we seek to answer will determine how useful that data is in helping to advance our society. As Patrick Gelsinger puts it, “Data is the new science. Big data holds the answers. Are you asking the right questions?”

#BigData #FuturePredictions #Analysis

Understanding Lady Luck

Lady Luck is a Western personification of Fortuna, the Roman goddess of luck. Fortuna’s whims in dispensing good or bad fortune have been debated for centuries. Some of the best innovations we have today are the result of error, exaptation, or serendipity. When we think about the future, however, it is a visioning game of what could happen, not a matter of knowing what will happen.

The X-ray, discovered by a physics professor at 50 years old, was the result of an error. He was working with a cathode-ray tube and noticed that, even when he held thick paper in front of it, fluorescent material near the tube still glowed; that observation ultimately led to the discovery of the X-ray. Today it is common to use X-rays to examine broken bones and to detect pneumonia and cancer.

Scientific breakthroughs have also come about through exaptation. One example is the work of a company called GroundProbe, which makes radar equipment that is extremely precise over short distances. At first, the founders thought the technology would be used to find underground pipes and power cables, but when they tested it with consumers, there was not much demand. After additional market testing, they discovered a more valuable use: monitoring the stability of rock walls in mines to warn of collapses before they force dangerous emergency evacuations.

At the Imperial Festival in April 2018, Sharlyn Doshi, a student in the Department of Electrical and Electronic Engineering, presented work seeking to better explain the optimum psychological and environmental conditions for serendipitous discovery. Her study of human-computer interaction is one of many around the world examining serendipity and related approaches in computer science. Her research interest is not surprising given how many other scientific discoveries have also been made by luck, including the microwave oven, penicillin, Velcro, and even chocolate chip cookies!

While good fortune has a role in scientific discovery, research like Doshi’s will continue to trace the steps that lead to Lady Luck. Error, exaptation, and serendipity seem to favor the prepared mind. While the story of the apple falling on physicist Isaac Newton’s head is a myth, there is some evidence that watching apples fall from the tree may have contributed to his thinking about gravity. In reality, several factors likely help us see the world in a new way: favorable timing, having the right tools, and creative thinking. Lady Luck may favor those who dance near curiosity and opportunity. In the meantime, don’t lean too hard on luck for your next technological breakthrough, or it may evaporate.


R Programming Leads the Way


Today over 2 million people across the globe use R to analyze large data sets with ease. After decades of using SAS, Matlab, and ACL, this programmer recently converted to R for data analysis, mainly because running a small business makes it essential to find ways to save money, especially in information technology. One of the most compelling things about R is that it is free and has a library of scientific algorithms to support big data analysis and visualization. IEEE’s Spectrum survey described the R programming language as the “king of statistical computing languages for analyzing and visualizing big data.” As an open-source language that runs on all major platforms, including Windows, Mac, and Linux, R helps level the playing field regarding access to big data analytic tools.

R is available online and can be downloaded rather quickly. It has standard data management tools built in, supporting everything from basic summaries and joins to more complicated statistical analyses like tree models and ANOVA. There is also an active community of R users online, along with free videos to help the new professional get up to speed quickly. When an Excel spreadsheet no longer works because there are over 2 million rows of data to analyze (Excel tops out at 1,048,576 rows), R can provide a timely solution.
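As a minimal sketch of the kinds of summaries, joins, and models described above (the tables, column names, and figures here are hypothetical, using base R plus the rpart package):

```r
# Minimal sketch: basic data management and modeling in R.
# The data frames and column names are hypothetical examples.

library(rpart)  # recursive partitioning for tree models

# Two hypothetical tables: customers and their transactions
customers    <- data.frame(id = 1:4,
                           region = c("East", "West", "East", "South"))
transactions <- data.frame(id     = c(1, 1, 2, 3, 4, 4),
                           amount = c(120, 80, 200, 45, 310, 95))

# A basic summary and a join (merge) across the two tables
summary(transactions$amount)
combined <- merge(customers, transactions, by = "id")

# One-way ANOVA: does average transaction amount differ by region?
summary(aov(amount ~ region, data = combined))

# A simple tree model predicting amount from region
tree_fit <- rpart(amount ~ region, data = combined,
                  control = rpart.control(minsplit = 2))
print(tree_fit)
```

The same few lines scale from toy tables like these to data frames with millions of rows, which is where R takes over from spreadsheets.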

According to a recent Gartner report, artificial intelligence (AI) is estimated to create 2.3 million jobs by 2020. The job search site Indeed found that employer demand for AI skills has more than doubled over the past three years. Companies are searching for candidates who know how to use big data analytics tools like R to support their business and artificial intelligence growth goals. While artificial intelligence has undoubtedly taken away some jobs, among the jobs added back into the economy are those requiring R programming skills. The average pay for data scientists with R skills is $115,531 per year.


R programming is a vital tool for many big data scientists in some of America’s largest companies. For example, Twitter’s data science toolbox uses R to help define its customer service interactions. Facebook uses R to analyze status updates and profile pictures to identify key relationships. Microsoft, Nordstrom, and many others use the R language to identify data patterns and develop targeted solutions to fundamental business problems. Google uses R to determine the effectiveness of its advertisements, and The New York Times and The Economist have used R’s visualization tools in their publications. A recent study also found that 80% of Airbnb data scientists use R, and 64% of them use it as their primary analysis tool.


While programming has changed over the years, the fundamentals have stayed the same: a problem is defined, a solution is identified, a tool is selected, and the program is developed, tested, and debugged. Big data programming is an essential piece of evolving technologies like voice recognition and artificial intelligence. R is a widely used tool in the big data industry, freely accessible across the globe, that supports everything from big data analytics to artificial intelligence modeling. To get started, download R here: https://www.r-project.org/

#R #BigData #Programming #ArtificialIntelligence

The Future of Higher Education in a Digital, Skills-Based World


As new high school graduates enter colleges and universities, both online and on campus, the value of higher education is under attack as Americans question the return on the investment. Our beloved institutions are expected to produce graduates who can function in a variety of environments: learning continuously, adapting quickly to technological change, solving problems, innovating, and bringing appropriate human-centered skills. Within the confines of the ivory towers, administrators worry about attracting and retaining faculty, the rising costs of deferred maintenance, and much-needed investments in technological infrastructure. Universities are at a tipping point as their traditional academic and research models become less viable. Furthermore, there is increased competition, soaring tuition, and escalating pressure for accessibility, relevance, and practical research. As recently featured in the Wall Street Journal, employers are thinking about new ways to hire, and some are even removing four-year degree requirements from their job descriptions. Not unlike the newspaper business, those in higher education need to adapt to survive.

A dynamic force at play here is the changing relationship between higher education and employment. The world is moving away from lifelong relationships with employers that define an employee’s sole identity. Loyalty to one corporation has become a reflection of the past as retirement benefits change and companies downsize, merge, collaborate, and restructure. Cutbacks in benefits and changes in tax structures globally are shifting more risk onto the individual employee. Furthermore, digital advances have transformed the way we think about our workplaces and the skill sets we must demonstrate. Technology has changed the types of work available to people as well as the expectations placed on new hires.

To further complicate the situation, some employers argue that professionals are leaving higher education institutions without the set of skills necessary to enter the workforce. The reality is that it may become more common for workers not to have permanent jobs. Contract, skills-based, freelance, part-time, and adjunct positions are more commonplace today than ever before. This shift requires a new skill set that future generations will need to embrace strategically over time to be successful. We desperately need to bring the future of skills to the future of work in America, which puts new pressure on our legacy educational systems. Furthermore, as burnout and boredom in corporate life accelerate, the drive toward engaging and meaningful work becomes more relevant.

There is an ongoing academic debate about whether universities exist to provide skills-based training, to deliver social impact for the general good of society, or something else altogether. Given the rapid pace of digitalization and globalization, it is imperative that leaders in higher education get their hands dirty experimenting with these trends to protect society’s future.


Some higher education institutions have already started to reshape incentives to support faculty research that responds to real-life challenges. Social impact learning positions students and educators to serve the public good in a scalable way and deliver sustainable value. Whether it means solving the social problems of local communities or addressing real-world business challenges, this is the era of transferring new knowledge into workable skills. For example, Cornell University currently has an evidence-based research partnership to address the opioid crisis in upstate New York; the university is working on a real-life problem that speaks directly to community concerns.


From a skills-based perspective, the University of Virginia School of Engineering’s Link Lab has a graduate program in which students make discoveries and translate their knowledge into new technologies, products, and services, with a specific emphasis on the Internet of Things and big data innovations that solve large-scale societal problems. At Penn State, graduate students are getting early clinical experience through a web-based program built on clinicians’ real experiences. “What this program does is provide graduate students with guided learning on simulated cases before working with real patients,” said Anne Marie Kubat, Director of the Speech and Language Clinic at Penn State.

Colorado School of Mines is looking into the future of work by offering the first graduate degree program supporting mining in outer space.

Given that taxpayers subsidize some large institutions, academic research is becoming increasingly focused on real-world applications that are timely, relevant, and accessible. As the curriculum continues to adapt to the skills-based economy, key questions for administrators include:

•   What are the skills that are worth developing? How are these explicitly stated and communicated to students?

•   How well do these skills match those required by businesses? Do we need to add or adapt existing skills to make them more relevant?

•   What teaching methods are most likely to lead to the development of these skills?

•   What are the opportunities for practice and feedback on the development of the selected skills?

•   How are such skills assessed? What are the desired outcomes?

Scenario planning can be a useful tool for institutions facing these complex challenges, identifying significant events, leading actors, and their motivations. Leveraging scenario planning during strategic planning retreats, for example, can account for the social impact of change by creating different versions of the future that hedge against shifts in social preferences. The reality is that each institution faces unique challenges and a wide range of future scenarios, but by talking to key stakeholders about the future and ranking each scenario, prioritization is possible. Scoping the key issues is important for identifying relevant trends, outcomes, options, and actions. It must be stressed that there is a tendency to believe the future is an extension of the past; through proper scenario planning, it is possible to examine the large-scale forces that can push the future in different directions. In addition to those listed above, these forces include demographics and lifestyles, economics, politics, environmental factors, and technology. A useful tool for identifying the main driving forces behind trends is mapping the scenarios. The example shown below was created by the University of Edinburgh to help inform the future of e-learning.

[Figure: scenario-mapping matrix for the future of e-learning, University of Edinburgh, 2008]

While this graph was originally built in 2008, as Dr. Wade shares, “Some of these case studies describe scenario planning exercises that took place a few years ago. In those instances, we already have some idea which ‘future’ is materializing now, and with the miracle of 20/20 hindsight, we can see how the planners may have missed certain developments that turned out to be important. But we can also see how prescient some of the scenarios were, mapping out a ‘big picture’ that comes close to what is actually transpiring, even if a few of the details are wrong.”
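As a rough illustration of how such a scenario map can be sketched in base R (the axes and scenario names below are invented for illustration and are not the content of the Edinburgh study):

```r
# Illustrative scenario map: four named futures plotted against
# two driving forces. Axis labels and scenarios are hypothetical.

plot(NULL, xlim = c(-1, 1), ylim = c(-1, 1),
     xlab = "Technology adoption: slow to fast",
     ylab = "Funding model: public to private",
     main = "Scenario map (illustrative)", axes = FALSE)
abline(h = 0, v = 0, lty = 2)  # quadrant dividers

# One named scenario per quadrant
text(x = c(-0.5, 0.5, -0.5, 0.5),
     y = c( 0.5, 0.5, -0.5, -0.5),
     labels = c("Steady state", "Tech-led disruption",
                "Community learning", "Market-driven education"))
```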

Legacy institutions will need to adapt their business models to be more flexible and relevant in light of emerging trends. As the evolution of higher education is underway, leaders will need to find new methods to encourage and inspire minds during these transition periods to position graduates for success in a global environment. As Drs. De Bonte & Fletcher share, “Perseverance is required if you intend to implement change at scale. Fixing products is easy. Fixing the processes and organizations that build them is hard.” While colleges and universities can sometimes have a reputation for being resistant to massive change, it is time to step out of the comfort zone to ensure the investments in our beloved higher education institutions have relevance in the future. Staying focused on the real-world needs of students is a step in the right direction.

#HigherEd  #SkillsBased  #Digitalization #ScenarioPlanning

Apple Predicted Siri Over 30 Years Ago: What Drives Their Success?


The future is uncertain. However, companies use forecasting to try to predict or estimate an event or trend; a prediction is a statement about an event that has not yet taken place. In technology, there have been plenty of prophecies that have not come true. In 1977, the president of Digital Equipment Corp. said, “There is no reason anyone would want a computer in their home.” On the other hand, some futurists were surprisingly accurate.

For example, in 1987, Apple shared its vision of the future of computing, called the “Knowledge Navigator”: a touchscreen device with video calling and a built-in personal assistant.


Apple even estimated the date the new product would come to market as September 16, 2011, which was only 18 days off from when those features actually became available in the marketplace. In effect, former Apple CEO John Sculley predicted Siri over 30 years ago.

One dominant force in Apple’s success is its focus on the customer who has both a personal and a professional life. Back when the iPhone first came out, most people had a BlackBerry for work, and some had an iPhone for personal use. Apple did a good job staying focused on the adage, “The customer defines value.”

Instead of trying to entirely replace Microsoft Outlook, which at the time was used in 81 of the Fortune 100 companies, Apple knew it had to meet customers where they were to take the company to the next level. In 2007, when the iPhone came out, Apple quietly licensed Exchange ActiveSync technology from Microsoft so that users could sync email, contacts, and calendars between their phone and work. Then, once Apple gained the desired market share, it was careful to protect its proprietary technology rather than enable cross-platform compatibility.

Another force driving Apple’s success is that its vision was backed by exceptional marketing. Campaigns focused on “groundbreaking” and “revolutionary” new technology built around the user experience. Whether it was always so groundbreaking may be a matter of debate, but that is Apple’s consistent brand. Apple uses phrases like “superior user experience” to distinguish itself from its competitors, and market research has shown that these campaigns resonate with customers.

Apple’s history is worth studying, especially given that its stock rose more than 30 percent in 2018, making it the first publicly traded company in the United States to hit a market value of over $1 trillion.

#Apple #Forecast #Predictions

AI-based Scenario Planning

Traditional forecasting uses past observations to predict the future in order to improve customer engagement, increase sales, reduce inventory, and improve productivity. While simple to understand, this method too often lacks the precision and agility needed for important decisions. The past does not always equal the future, because environments are complex and chaotic. Such trend lines lose value as they force moving averages and smooth out variability until we are left with an oversimplified picture that may not reflect reality. Luckily, more sophisticated tools today produce future insights through the power of predictive simulation with time-based KPIs to support well-informed decision making. Moving away from oversimplified assumptions helps companies develop models that are complex, probabilistic, and able to incorporate potential failure events.
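As a minimal sketch of the difference in R (all figures are made up: an uncertain demand baseline and an assumed 5% chance of a failure event), a probabilistic simulation yields a distribution of outcomes rather than a single smoothed trend line:

```r
# Minimal Monte Carlo sketch: simulate a KPI (e.g., quarterly demand)
# instead of extrapolating one moving average. Figures are hypothetical.

set.seed(42)
n_scenarios <- 10000

simulate_quarter <- function() {
  base_demand  <- rnorm(1, mean = 1000, sd = 150)  # uncertain baseline
  supply_shock <- rbinom(1, 1, prob = 0.05)        # rare failure event
  # A shock cuts fulfilled demand by 40%
  base_demand * ifelse(supply_shock == 1, 0.6, 1.0)
}

outcomes <- replicate(n_scenarios, simulate_quarter())

# A range of futures, not one number: median plus a 90% interval
quantile(outcomes, c(0.05, 0.50, 0.95))
```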

Scenario planning provides a framework for developing policies in the face of uncertainty. A scenario is merely an account of a plausible future, and scenario planning contrasts different scenarios to explore the uncertain future consequences of a given decision. It is vital in scenario planning to have a clear purpose. Necessary steps include assembling a diverse group of participants to collect, discuss, and analyze scenarios. When artificial intelligence is incorporated into the tool, this diverse group should be involved in the early stages to help define the inputs into the program. Companies across the world use these tools today to make decisions based on what they think will happen in the future. The benefits of scenario planning include a deeper understanding of important uncertainties, the incorporation of alternative perspectives into planning, and greater resilience of decisions to surprise.

For example, IBM uses a scenario-building tool powered by artificial intelligence to support risk management activities in areas like security and finance. The tool can generate alternative scenarios of the future and predict outcomes, including both likely and unlikely futures. Structured and unstructured data alike, such as relevant news, social media trends, and domain knowledge, can be fed in to paint the current state and generate scenarios explaining the key drivers and related futures.

Another example is Stena Line, which recently announced the use of artificial intelligence on its ferries. The model predicts the most fuel-efficient way to operate a vessel on a route, taking into account alternative perspectives on the future. Jan Sjostrom shares, “Planning a trip and handling a vessel in a safe and, at the same time, fuel-efficient way is craftsmanship. Practice makes perfect, but when assisted by AI a new captain or officer could learn how to fuel optimize quicker. In return, this contributes to a more sustainable journey.”

From transportation to healthcare, companies are leveraging machine learning engines that analyze a wide variety of data and present a range of scenarios for optimization. With sophisticated tools that can learn and improve, there is a shift away from unfocused, historical data mining toward iterative, real-time decision analysis that provides more useful insights for the future.

#PredictiveModeling #AI #ScenarioPlanning #MachineLearning


Robots Learning from Their Mistakes

The truth is that not all mind-blowing, world-changing inventions came about through strategic focus on a team goal; some scientific discoveries were made by complete mistake. For example, the inkjet printer traces back to a Canon engineer who accidentally rested his hot iron on a pen, forcing ink out of it. That accident became the foundation for the inkjet printer. Many famous inventions happened because what was supposed to happen did not.

A well-known technological breakthrough occurred when an assistant professor at the University at Buffalo thought he had ruined his science project. He was supposed to use a 10,000-ohm resistor, in keeping with the strategic focus of the project. Carelessly, he used a 1-megaohm resistor instead. The circuit did something different as a result of this mistake: it produced a signal that ran for 1.8 milliseconds, stopped, and then started again. It turns out that this same pattern matches the human heartbeat. His error resulted in a small circuit that shrank the pacemaker from the size of a large, heavy television to a device weighing only as much as two paper clips.


Considering that almost 4 million people today have pacemakers implanted in their chests and tens of thousands more are placed in patients each year, the assistant professor’s failed project seems pretty important: it has resulted in countless saved lives.

Fire has amazed humans for centuries, and the ability to control it has distinguished humans in preparing food and surviving hostile environments. In 1826, a British pharmacist was struggling, without success, to clean a stirring utensil. A large lump had formed on the end of his stick, and when he tried to scrape it off, it ignited instead of coming loose. This troublesome error resulted in the first strikeable matches, which were then sold in a bookstore.


Between 1827 and 1829, this mistake resulted in the sale of 168 matches. By 1858, as the invention was refined, manufacturing capacity had grown to 12 million matchboxes a year. While butane lighters are more popular today, approximately 500 billion matches are still used annually in the United States alone, all thanks to the glob that would not come off the stick.

The famous adage is to learn from mistakes. These inventors not only reflected on their errors but transformed the entire world with what they learned. All people face adversity from time to time, but successful people find a way to navigate the roadblocks and even flourish when things get difficult. Take a hard look at your execution when things do not go as planned; it is important to process what happened and then resume moving forward.

Some researchers have recently taken this concept of finding success in failure to the next level by programming robots. For the first time, computer scientists are making progress in developing artificial intelligence that can learn from its own mistakes. The development comes from an open-source algorithm called Hindsight Experience Replay (HER), released in March 2018. The technique reframes failure as success and attempts to formalize what many successful humans do naturally. According to OpenAI’s blog, “The key insight that HER formalizes is what humans do intuitively: Even though we have not succeeded at a specific goal, we have at least achieved a different one. So why not just pretend that we wanted to achieve this goal, to begin with, instead of the one that we set out to achieve originally?”

In reinforcement learning terms, the machine either gets rewarded or not based on whether it achieves a goal. HER can be viewed essentially as moving the goalpost, or having a goal in hindsight. In a sense, these researchers are teaching the machine to treat what was traditionally viewed as “failure” not as a stopping point but as a teaching moment, even rewarding the machine for achieving an unintended goal in the face of adversity. And while artificial intelligence has a long way to go with this kind of learning, it could be argued that humans have a tough time with it sometimes, too. Failures are as much a part of learning as successes. It is through learning that we realize visions previously imagined only in dreams. A toy sketch of the relabeling idea appears below.
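This is a highly simplified, one-dimensional sketch in R of HER’s relabeling idea, not OpenAI’s actual implementation; the states, goals, and reward rule here are toy assumptions:

```r
# Toy sketch of Hindsight Experience Replay (HER) relabeling.
# One-dimensional illustration only; not OpenAI's implementation.

# An "episode": the agent tried to reach goal 10 but ended at 7
episode <- list(states = c(0, 3, 5, 7), goal = 10)

reward <- function(state, goal) as.numeric(state == goal)

# Standard replay: the episode is judged against the original goal
final_state     <- tail(episode$states, 1)
original_reward <- reward(final_state, episode$goal)      # 0: a failure

# HER relabeling: pretend the achieved state WAS the goal
relabeled        <- episode
relabeled$goal   <- final_state
hindsight_reward <- reward(final_state, relabeled$goal)   # 1: a "success"

# Both versions go into the replay buffer, so the agent still
# receives a learning signal from the failed attempt
replay_buffer <- list(
  list(episode = episode,   reward = original_reward),
  list(episode = relabeled, reward = hindsight_reward)
)
str(replay_buffer, max.level = 2)
```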

#ArtificialIntelligence #MachineLearning #HumanExperience

The Banking Industry Fuels the Heart of Money with Artificial Intelligence


Big data analytics have had a tremendous impact on the banking industry. Bank of America, Wells Fargo, BBVA, Citi Ventures, and Ally Bank are all currently using machine learning programs. According to Accenture, 67 percent of banking employees expect intelligent technologies to create new opportunities in the nature of their future work. Machine learning insights have provided competitive advantages that are both lucrative and game-changing. Executives can examine their existing operating models to understand where strategic artificial intelligence investments will pay off. From fraud detection to customer service, the banking industry is thriving on business intelligence. Andrew Lo, Director of the Laboratory for Financial Engineering at the MIT Sloan School of Management, shares, “I suspect that it’s going to transform all aspects of the financial industry because there are so many parts of it that can be automated using these kinds of algorithms and access to large pools of data.”

Tata Consultancy Services research estimates that artificial intelligence has already helped the banking industry reduce costs by 13 percent. Given the breadth and depth of the digital revolution, executives should also involve those with machine learning expertise in strategic planning sessions in the boardroom. To maximize the return on big data investments, it is critical for executives to better understand the role that big data plays in each aspect of their business; banking executives who do not stay current on big data advancements risk losing market share. Specific areas where big data is being used successfully in the financial services sector include fraud detection, marketing, credit risk management, operational efficiency, and customer insight and service.

Fraud Detection


McAfee’s recent report estimates that cybercrime costs the global economy $600 billion, or 0.8% of global gross domestic product, with the most common crime being credit card fraud. Many of these frauds happen very quickly. Fortunately, systems powered by machine learning are excellent at rapidly reviewing large volumes of data and detecting patterns and outliers that may indicate fraud. Different clusters can be modeled to fit various customer profiles and transactions, supporting predictive modeling embedded with machine learning. Banks can use advanced algorithms to identify stolen credit cards and fraudulent purchases before the customer is even aware there could be an issue. In 2016 alone, it is estimated that artificial intelligence algorithms stopped nearly $17 billion in fraud activity. As algorithms become more sophisticated, neural networks have the potential to drastically reduce future economic losses.
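As a minimal sketch in R of the clustering-and-outlier idea described above (the transactions are simulated, and the feature set, cluster count, and 1% flagging threshold are illustrative assumptions, not any bank’s actual system):

```r
# Minimal fraud-flagging sketch using k-means clustering in base R.
# Transaction data are simulated; thresholds are illustrative.

set.seed(7)

# Simulated features: transaction amount and hour of day
normal <- data.frame(amount = rlnorm(500, meanlog = 4, sdlog = 0.5),
                     hour   = rnorm(500, mean = 14, sd = 3))
odd    <- data.frame(amount = c(5000, 7200), hour = c(3, 4))  # suspicious
txns   <- rbind(normal, odd)

# Cluster scaled features into behavior profiles
scaled <- scale(txns)
fit    <- kmeans(scaled, centers = 3, nstart = 20)

# Distance of each transaction to its assigned cluster center
centers <- fit$centers[fit$cluster, ]
dists   <- sqrt(rowSums((scaled - centers)^2))

# Flag the farthest outliers (top 1%) for human review
txns$flagged <- dists > quantile(dists, 0.99)
txns[txns$flagged, ]
```

Transactions far from every behavior cluster, like the two late-night, high-value rows above, are exactly the outliers such a system would surface for review.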

Marketing


Big data analytics enables transformative marketing in both personalization and research. Mass email blasts are a tool of the past and will likely not be effective in upcoming years, as customers feel overwhelmed by the amount of information they receive. New machine learning algorithms support automated, real-time customer engagement, and even content development, when customers tweet, post on Facebook, share a thought on Instagram, or write a blog referencing the company. Machine learning can reach millions of targeted customers in seconds. Customer needs and values can also be prioritized and matched with strategic communications to ensure the effectiveness of the message. Marketing with machine learning is moving from a “nice to have” tool to an integrated part of the company’s business strategy. Companies are also leveraging new machine learning techniques to study consumers online to better inform their brand loyalty and trust strategies.

Credit Risk Management


Over the years, banks have faced increasing scrutiny over credit standards and poor risk management strategies. Generally speaking, credit risk arises when a potential borrower may not meet the lending obligations and terms. Banks want to maximize their risk-adjusted rate of return while balancing the risk in their portfolios. According to a survey by the Global Association of Risk Professionals, 88 percent of bank executives think machine learning adoption could provide a foundational change for their risk management programs.

With risk and value being two sides of the same coin, it is not surprising that, according to the Syncsort survey, more than half of respondents said they already use big data tools to increase revenue and accelerate growth. Banking data is starting to be combined with personal data to improve forecasting for credit card risk management.

Simudyne Technology takes a tailored approach, providing banks with computer models that show millions of future scenarios and let executives test how individual factors interact in those different scenarios. Other machine learning techniques leverage models that incorporate a bank’s size and risk tolerance levels to drive new operational processes. For example, some banks take outstanding customer balances or previous payments and overlay that data with standard credit bureau data to create data sets that better assess the risk within each transaction. Banks can then use machine learning across their credit risk portfolios to raise or lower an individual credit line in alignment with the bank’s risk tolerance. As regulation continues to increase, new algorithms will likely be created to better determine the levels of risk involved in the exchange of value, payment and settlement disputes, and risk-return policies.
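As a minimal sketch in R of this kind of overlay (the borrowers are simulated, and every column name and coefficient is a hypothetical stand-in, not a real bank’s or bureau’s model), a logistic regression can combine internal balance history with an external bureau score to estimate default risk:

```r
# Minimal credit-risk scoring sketch using logistic regression.
# Borrower data are simulated; variable names are hypothetical.

set.seed(11)
n <- 1000

borrowers <- data.frame(
  avg_balance   = rlnorm(n, meanlog = 7, sdlog = 1),    # internal bank data
  late_payments = rpois(n, lambda = 0.5),               # internal bank data
  bureau_score  = round(rnorm(n, mean = 680, sd = 60))  # external bureau data
)

# Simulate defaults: more likely with low scores and late payments
logit <- -4 + 0.8 * borrowers$late_payments -
  0.01 * (borrowers$bureau_score - 680)
borrowers$default <- rbinom(n, 1, plogis(logit))

# Fit the combined model and score a new applicant
fit <- glm(default ~ avg_balance + late_payments + bureau_score,
           data = borrowers, family = binomial)

new_applicant <- data.frame(avg_balance = 1200,
                            late_payments = 2, bureau_score = 610)
predict(fit, new_applicant, type = "response")  # estimated default prob.
```

The predicted probability is the kind of score a bank could compare against its risk tolerance before raising or lowering a credit line.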

Operational Efficiency


According to the Syncsort report, over 40 percent of respondents said one of the most significant benefits of big data is the ability to increase business agility, so it is not surprising that many businesses are using big data analytics to improve operations. Machine learning provides the ability to process more transactions while eliminating downtime. For example, Bank of America already lets customers use voice commands to look up account information and transfer money. Robotic Process Automation (RPA) is streamlining processes by augmenting human capabilities, and Union Bank has already deployed RPA bots that automate ATM transactions. Machine learning is also being used to support more timely decisions on customer loans. The traditional brick-and-mortar strategy of banks may soon be supplemented, if not replaced, by user-friendly apps and online solutions.

Customer Insight and Service


Since banking institutions already have to be transparent with their data as a result of the General Data Protection Regulation and the Payment Services Directive, why wouldn’t these organizations also leverage that same data for competitive insight about their customers to grow their business? According to a NewVantage survey, 53.4 percent of companies have experienced success in improving customer service as a result of their big data analytics programs. As financial institutions learn more about their customers with each transaction, customers in turn expect more personalized interaction and deeper insight from the company.

For example, artificial intelligence can be leveraged to respond to commonly asked questions in a timely fashion. Banks are also looking to machine learning to better advise customers on investments and savings based on their transaction history. According to Matt Meyer, CIO for BrightStar, “It’s about keeping pace with other industries. Ride-sharing apps have been a disruptor to taxi services, and there are lessons to be learned there for the financial industry. Customers want ease of access, they want ease of payment, and they want that speed.”

In summary, sustained investment in big data is critical to keeping the financial services industry relevant. In a recent Accenture study, 77 percent of banks said they plan to use artificial intelligence to significantly automate tasks in the next three years. “We’ve been working to build out the technology stack at Citi so that we can drive broad adoption of machine learning within various use cases and functions,” said the Managing Director of Citi Ventures. Frankly, with all the major players in banking at the table with machine learning, it is no longer a luxury but an operational necessity. Those who engage digital innovation experts in their boardroom strategic planning sessions are likely to be better prepared for this digital revolution.

A recent Autonomous Next study showed that ongoing machine learning programs could yield cost savings of $450 billion across the banking industry by 2030. The evolving digital economy creates new opportunities and threats for traditional banking systems across their core business. Addressing existing high-priority banking problems with artificial intelligence and machine learning will not only keep banking organizations relevant but, in a time of increased economic concern, could help pave their way to market dominance if embraced strategically.

#AI #BankingIndustry #MachineLearning