
Data-intensive intelligent applications in a hybrid cloud blueprint

Introduction

Almost every action we perform generates data. From simple daily tasks like buying groceries or booking a trip, to complex operations like running manufacturing plants and aircraft, organizations across the globe produce, capture, distribute, and store data. This detail covers the stages of a data-intensive intelligent application life cycle. From start to finish, every aspect of this life cycle has specific requirements regarding skills, tools, and infrastructure. All the stages need to be connected in today’s dispersed deployment environments, potentially spanning multiple cloud providers, in-house datacenters, and edge devices. This detail provides a Red Hat® view of the methodology and technology required to build, maintain, and manage these complex hybrid cloud environments.

The life cycle of data-intensive intelligent applications

While single data points can seem insignificant, the combination of data, or events, from a single source or a collection of event sources can be used to infer higher-value information. (For more on this subject, see Event-driven architecture for a hybrid cloud blueprint). This kind of information can help organizations make better business decisions, deliver better customer experiences, pre-empt problems, and remain competitive (e.g., fraud prevention, next-best action, clinical decision making). By treating each incoming data point as an event, organizations can apply decision management and machine learning (ML) inference techniques to filter, process, qualify, and combine events to deduce higher-order information. The availability of this information paves the way for the development of intelligent applications that can offer more context-aware and personalized services to end customers, applications, and systems. We refer to these types of applications as data-intensive intelligent applications.
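To make the filter-qualify-combine idea concrete, the following is a minimal sketch in Python. All names here (`Event`, `qualify`, `combine`, `infer`, the fraud-style threshold and window parameters) are illustrative assumptions, not part of any Red Hat product API: individual transaction events are qualified against a threshold, grouped per source within a time window, and the groups are used to infer a higher-order signal, such as a possible-fraud flag.

```python
from dataclasses import dataclass
from typing import Dict, Iterable, List

@dataclass
class Event:
    source: str       # where the event originated, e.g. a card ID
    amount: float     # payload value, e.g. a transaction amount
    timestamp: float  # seconds since epoch

def qualify(event: Event, threshold: float) -> bool:
    """Filter step: keep only events whose amount exceeds a threshold."""
    return event.amount > threshold

def combine(events: Iterable[Event], window: float) -> List[List[Event]]:
    """Combine step: group events from the same source that occur
    within `window` seconds of the previous event in the group."""
    groups: Dict[str, List[List[Event]]] = {}
    for e in sorted(events, key=lambda e: e.timestamp):
        buckets = groups.setdefault(e.source, [[]])
        if buckets[-1] and e.timestamp - buckets[-1][-1].timestamp > window:
            buckets.append([])  # gap too large: start a new group
        buckets[-1].append(e)
    return [g for b in groups.values() for g in b if g]

def infer(events: List[Event], threshold: float, window: float) -> List[str]:
    """Inference step: flag sources with three or more large
    transactions inside a single time window."""
    qualified = [e for e in events if qualify(e, threshold)]
    return [g[0].source for g in combine(qualified, window) if len(g) >= 3]
```

In a production pipeline each step would typically run as a streaming operation over an event broker rather than over an in-memory list, but the shape of the logic — filter, qualify, combine, infer — is the same.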

The life cycle of this type of application includes various stages:

• Data ingestion: Intake, pre-processing, and transportation

• Data engineering: Storage and transformation

• Data analytics: Data analysis and model training

• Runtime inference: Model serving and monitoring

• Business events and insight management: Event management, insights, and process and integration management

These stages have their own characteristics and challenges.
