The shift towards data technologies does not eliminate the other stages of technology innovation. Underneath a software-defined network, there are wired computers somewhere. Computerized knowledge systems still have their place: who would want their taxes done by a neural network trained on last year's returns? But as the world transforms digitally and turns into data, data-driven technologies are a logical consequence.

The increase in tech intensity we experience today is an increase in data intensity. In physics, intensity is the magnitude of a quantity per unit. For example, sound intensity is the power transferred by sound waves per unit area. More colloquially, intensity is understood as a high degree of strength or force. Both the colloquial and the theoretical definitions of intensity are useful in our context, although we will not attempt a mathematical formula of data intensity.
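For readers who want the physics made concrete, sound intensity is simply power spread over area:

$$ I = \frac{P}{A} $$

where $P$ is the acoustic power in watts and $A$ is the area the sound passes through in square meters, so $I$ is measured in watts per square meter.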
Data intensity is about the attributes and properties of the data, such as volume, velocity, types, and structure, and how you transfer the energy in the data into value.

In his book *Designing Data-Intensive Applications*, Martin Kleppmann distinguishes data-intensive from compute-intensive applications, depending on the nature of the primary constraints on the application. In compute-intensive applications, you worry about CPU, memory, storage, networking, and infrastructure for computation. In data-intensive applications, the data becomes the primary challenge and concern.

This shift follows the familiar pattern towards data technologies. The underlying compute infrastructure is still essential, but automated provisioning and deployment, infrastructure as code, and auto-scaling of resources ease computing concerns. When you worry about auto-scaling the application database, adding real-time text search capabilities to a mobile app, adding recommendations based on click-stream data, or managing data privacy across cloud regions, then your application has become more data-intensive.
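To make one of those concerns tangible, here is a deliberately tiny, hypothetical sketch of recommendations built from click-stream data; the item IDs, sessions, and function names are invented for illustration and are not from any particular product.

```python
# Toy sketch only: naive "people who clicked this also clicked..." from session data.
# Item IDs and sessions are invented for illustration.
from collections import Counter, defaultdict
from itertools import combinations

# Each session is the set of items one user clicked during a visit.
sessions = [
    {"sku-101", "sku-202", "sku-303"},
    {"sku-101", "sku-202"},
    {"sku-202", "sku-404"},
]

# Count how often each pair of items shows up in the same session.
co_clicks: dict[str, Counter] = defaultdict(Counter)
for session in sessions:
    for a, b in combinations(sorted(session), 2):
        co_clicks[a][b] += 1
        co_clicks[b][a] += 1

def recommend(item_id: str, k: int = 2) -> list[str]:
    """Return up to k items most often co-clicked with item_id."""
    return [item for item, _ in co_clicks[item_id].most_common(k)]

print(recommend("sku-101"))  # ['sku-202', 'sku-303']
```

Nothing in this toy is difficult; the data intensity appears when the same click stream must also feed search, reporting, and privacy controls, continuously and at scale.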
The concept of the data intensity of applications extends to data intensity in organizations. The data intensity of an organization increases as it manages a greater diversity of data (e.g., by volume, type, speed), becomes more data literate, adopts more data-driven technologies (e.g., data integration, data flows, no-code ELT), and builds its unique data-driven content (e.g., predictive models).

Increased data intensity should be a good thing. As focus shifts from operating data centers to being data-centered, the rate of innovation should increase. An increase in data literacy should result in better decisions. Software-defined technologies should make processes programmable, eliminate risk, and make the organization more adaptive. Building your own predictive models should increase differentiation and enable a better customer experience through personalization.

Alas, that is not the experience in many organizations. Instead of letting teams focus on how to get the most from data, the challenges surrounding the data create massive bottlenecks. Rather than fuel digital transformation, data intensity seems to choke it.

The application that needs to combine structured operational data with unstructured data in document stores to produce searchable, geo-referenced, real-time analytic insights will be highly complex if it must stitch together 10 disparate technologies. Data stored in separate systems needs to be combined for reporting, causing time-consuming and costly data movement and duplication. A lack of skills and scale makes it difficult to build unique data-driven assets from your data. Data systems that reach their scale limits often do not slow down gracefully. When they get to the wall, they hit it hard.