New Snowflake report reveals an average of 90 AI apps built per day in 2023
Large language models (LLMs) are increasingly being used to create chatbots, according to Data Cloud company Snowflake. As generative AI continues to revolutionise the industry, chatbots have grown from approximately 18 per cent of all LLM apps available as of May 2023 to 46 per cent today, and that share is still climbing. In addition, a survey of Streamlit's developer community found that nearly 65 per cent of respondents said their LLM projects were for work purposes, underscoring the growing importance of harnessing generative AI to improve workforce productivity, efficiency, and insight.
These results are based on usage data from more than 9,000 Snowflake customers and are summarised in Snowflake's new "Data Trends 2024" report. The report focuses on how global enterprise business and technology leaders are leveraging resources such as AI to build their data foundation and transform future business operations. The new data shows a shift from LLM applications with one-shot, text-based input (2023: 82%, 2024: 54%) towards chatbots that take iterative text input and allow a natural back-and-forth conversation.
“Conversational apps are on the rise, because that’s the way humans are programmed to interact. And now it is even easier to interact conversationally with an application,” explains Jennifer Belissent, Principal Data Strategist at Snowflake. “We expect to see this trend continue as it becomes easier to build and deploy conversational LLM applications, particularly knowing that the underlying data remains well governed and protected. With that peace of mind, these new interactive and highly versatile chatbots will meet both business needs and user expectations.”
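To illustrate how lightweight such a conversational app can be, here is a minimal sketch of a Streamlit chat interface. It is not taken from the report: the `answer_with_llm` helper is a hypothetical stand-in for whichever LLM backend a team would actually call, and the rest uses Streamlit's standard chat components.

```python
import streamlit as st

# Hypothetical helper: swap in whichever LLM backend the team uses.
def answer_with_llm(prompt: str, history: list) -> str:
    return f"(model response to: {prompt})"  # placeholder reply

st.title("Support chatbot")

# Keep the running conversation in session state so each rerun replays it.
if "messages" not in st.session_state:
    st.session_state.messages = []

for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

# st.chat_input returns None until the user submits a message.
if prompt := st.chat_input("Ask a question"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    reply = answer_with_llm(prompt, st.session_state.messages)
    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.markdown(reply)
```

Because the conversation history lives in session state, each new user turn can be sent to the model together with the preceding exchange, which is what makes the interaction iterative rather than one-shot.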
Over 33,000 LLM Applications in Nine Months
The report also shows that 20,076 developers in Snowflake's Streamlit community have built 33,143 LLM apps in the past nine months.
When it comes to developing Artificial Intelligence (AI) projects, Python is the programming language of choice due to its ease of use, active developer community, and vast ecosystem of libraries and frameworks. In Snowpark, Snowflake's framework for building apps quickly and cost-effectively, the use of Python grew significantly faster over the past year than that of Java or Scala: Python grew by 571 per cent, Scala by 387 per cent, and Java by 131 per cent. With Python, developers can work faster, accelerating prototyping, experimentation, and therefore overall learning as developer teams make early forays into cutting-edge AI projects.
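For context, a minimal Snowpark for Python sketch looks like the following; the connection parameters and the `APP_EVENTS` table are placeholders, not details from the report.

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col

# Placeholder connection parameters; real values come from your Snowflake account.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# Transformations are built lazily and pushed down to Snowflake at execution
# time, so no data is copied out to the client until .show() or .collect().
daily_latency = (
    session.table("APP_EVENTS")               # hypothetical table name
    .filter(col("EVENT_TYPE") == "chat")
    .group_by(col("EVENT_DATE"))
    .agg(avg(col("LATENCY_MS")).alias("AVG_LATENCY_MS"))
)
daily_latency.show()
```

The DataFrame-style API is one reason Python prototyping moves quickly here: the same query logic runs where the data already sits, without exporting it first.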
In terms of where application development is taking place, the trend is towards programming LLM applications directly on the platform where the data is managed. This is indicated by a 311 per cent increase in the use of Snowflake Native Apps, which let developers build apps directly on Snowflake's platform, between July 2023 and January 2024. Developing applications on a single data platform eliminates the need to export data copies to third-party technologies, helping teams build and deploy applications faster while reducing operational maintenance costs.
Data Governance in Companies is Growing in Importance
As they adopt AI, companies are increasing the analysis and processing of their unstructured data. This is uncovering previously untapped data sources, making a modern approach to data governance more crucial than ever for protecting sensitive and private data.
The report found that enterprises have increased their processing of unstructured data by 123 per cent in the past year. IDC estimates that up to 90 per cent of the world's data is unstructured: video, images, and documents. Clean data gives language models a head start, so unlocking this untapped 90 per cent opens up a number of business benefits.
“Data governance is not about locking down data, but ultimately about unlocking the value of data,” said Belissent. “We break governance into three pillars: knowing data, securing data and using data to deliver that value. Our customers are using new features to tag and classify data so that the appropriate access and usage policies can be applied. The use of all data governance functions has increased by 70 to 100 per cent. As a result, the number of queries of protected objects has increased by 142 per cent. When the data is protected, it can be used securely. That delivers peace of mind.”
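As a rough illustration of what tagging, classification, and policy-based protection can look like in practice, the sketch below issues governance statements through an existing Snowpark `session` (see the earlier example). The tag, policy, table, and role names are hypothetical, not taken from the report.

```python
# Know the data: tag a column so governance tooling can find and classify it.
session.sql(
    "CREATE TAG IF NOT EXISTS pii_class COMMENT = 'Sensitivity classification'"
).collect()
session.sql(
    "ALTER TABLE CUSTOMERS MODIFY COLUMN EMAIL SET TAG pii_class = 'sensitive'"
).collect()

# Secure the data: a masking policy hides values from roles without access.
session.sql("""
    CREATE MASKING POLICY IF NOT EXISTS mask_email AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('DATA_STEWARD') THEN val ELSE '***MASKED***' END
""").collect()
session.sql(
    "ALTER TABLE CUSTOMERS MODIFY COLUMN EMAIL SET MASKING POLICY mask_email"
).collect()

# Use the data: queries against the protected column return masked values for
# unauthorised roles, so the data can be shared without exposing it.
session.sql("SELECT EMAIL FROM CUSTOMERS LIMIT 5").show()
```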
“Taken individually, each of these trends is a single data point that shows how organisations across the globe are dealing with different challenges. When considered together, they tell a larger story about how CIOs, CTOs, and CDOs are modernising their organisations, tackling AI experiments, and solving data problems — all necessary steps to take advantage of the opportunities presented by advanced AI,” said Belissent. “The important thing to understand is that the era of generative AI does not require a fundamental change in data strategy. It does, however, require accelerated execution of that strategy. It requires breaking down data silos even faster and opening up access to data sources, wherever they may be in the company or across a broader data ecosystem.”