How ALICE Works
Explore the ecosystem of digital services
The ALICE Platform
*Data assets are designed, built and tested in the ALICE Lab to be scaled and productionized through the Data Pipeline.
ALICE applies intelligent automation, data science, and human and artificial intelligence (collectively, “procedures”) to data assets to deliver digital services continuously and in near real time. The powerful ALICE Lab promotes structured collaboration of teams, data, intelligence and skills, allowing you to create procedures quickly and easily.
Procedures are designed, built and tested in the ALICE Laboratory (“ALICE Lab”) and then applied to data assets acquired through the Data Pipeline to derive outcomes. This is the ALICE digital service.
Through the Data Pipeline, ALICE collects, ingests, engineers and stores disparate and siloed sources of data in accordance with authoritative data source requirements and sound data management practices.
The ALICE Laboratory
The ALICE Laboratory (“the Lab”) is ALICE’s wonderland, where ideas around digital services to govern, manage and monitor your business come alive. In the Lab, you can author your own procedures to meet your business needs.
ALICE’s intelligent automation accelerates your digital journey through the Lab. The Lab facilitates the design, build, testing and simulation of digital services in a user-friendly way, prior to publishing them on the ALICE platform for consumption by your organization.
A data asset is source data collected, validated and engineered into usable, structured and consistently formatted information. The Lab allows for the enrichment, integration and/or consolidation of data assets to be ingested via the Data Pipeline and to which procedures will be applied.
Human intelligence encompasses logic, subject matter expertise, assumptions, business rules, outcome expectations, professional skepticism and judgement. The power of ALICE lies in translating this human intelligence into storable and reusable algorithms.
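To illustrate the idea of capturing human intelligence as a storable, reusable algorithm, here is a minimal sketch. The structure, names and rule are all hypothetical; ALICE's internal representation of procedures is not described in this document.

```python
# Illustrative sketch only: shows the concept of encoding a subject-matter
# expert's business rule as a named, reusable procedure. All names are
# hypothetical, not ALICE's actual API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Procedure:
    name: str
    description: str            # the captured human intelligence: rule, assumption, expectation
    rule: Callable[[dict], bool]

# Example rule: "invoices over 10,000 require two approvals"
needs_dual_approval = Procedure(
    name="dual_approval_check",
    description="Invoices over 10,000 require two approvals",
    rule=lambda record: record["amount"] <= 10_000 or record["approvals"] >= 2,
)

records = [
    {"id": 1, "amount": 4_500, "approvals": 1},
    {"id": 2, "amount": 25_000, "approvals": 1},   # violates the rule
]
# Applying the stored procedure to a data asset yields an outcome: exceptions.
exceptions = [r["id"] for r in records if not needs_dual_approval.rule(r)]
print(exceptions)  # [2]
```

Once a rule like this is stored, it can be reapplied to every new ingestion of data without the expert re-performing the judgement each time.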
Data science incorporates data analytics (descriptive, diagnostic, predictive and prescriptive) and data modelling. ALICE allows data scientists within organizations to combine their skills with other cognitive services using large data assets in a scalable, auditable and secure manner on managed infrastructure.
Organizational context allows procedures to be adjusted for customizations or nuances specific to your business. By enabling users to factor in organizational context, the power of your digital ecosystem grows exponentially.
Artificial intelligence includes a range of services on the cognitive spectrum, including natural language processing, optical character recognition, fuzzy logic, anomaly detection, machine learning (supervised and unsupervised), deduplication, historical and contextual modelling, etc.
Procedures applied to data assets (scaled and productionized via the Data Pipeline) that result in outcomes are what ALICE calls her digital services. These digital services allow you to govern, manage and monitor your business. This is when ALICE lights up!
The Data Pipeline
The Data Pipeline collects, ingests, engineers and stores disparate and siloed sources of data in accordance with authoritative data source requirements and sound data management practices. It is architected to handle data in a secure, scalable and auditable manner while maintaining the integrity of the data. Each component of the Data Pipeline is customizable to fit-for-purpose requirements, and each is monitored for health and performance.
Collection
Data is collected from disparate sources through custom-built ALICE connectors, cloud connectors, robotic process automation (RPA) bots or manual uploads that interact with ALICE through APIs. Regardless of type or content, connectors are configurable to run on a scheduled basis. Translation of information from an unstructured data source, for example an image or a document, is performed by a connector prior to ingestion of such data.
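As a rough sketch of what a scheduled connector configuration might look like, assuming a cron-style schedule field. Every field name here is illustrative; ALICE's actual connector configuration format is not published in this document.

```python
# Hypothetical connector configuration; field names and values are
# illustrative only, not ALICE's actual configuration schema.
connector_config = {
    "type": "sftp",                # one of: custom connector, cloud connector, RPA bot, manual upload
    "source": "erp-exports",
    "schedule": "0 2 * * *",       # cron-style: daily at 02:00
    "pre_ingestion": ["ocr"],      # translate unstructured documents before ingestion
    "target_api": "/api/v1/ingest",
}

def is_due(cron_expr: str, hour: int, minute: int) -> bool:
    """Tiny cron check covering only the 'minute hour * * *' case used above."""
    m, h, *_ = cron_expr.split()
    return (m == "*" or int(m) == minute) and (h == "*" or int(h) == hour)

print(is_due(connector_config["schedule"], 2, 0))  # True
```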
Ingestion & Validation
ALICE performs incoming schema validation prior to ingestion to minimize errors and to ensure consistency of data formats. Ingestion must be validated before the data can be further engineered (wrangled), to ensure that the incoming data is both usable per the customer's requirements and consistent per the setup requirements. ALICE has in-built logic and algorithms that keep ingestion fast without sacrificing quality, as well as the ability to advise on corrective measures for common validation problems. In addition to incoming schema validation, ALICE performs security validation on the incoming data.
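To make the schema-validation step concrete, here is a minimal sketch of checking incoming records against an expected schema and suggesting a corrective measure for a common problem. The schema, field names and messages are illustrative assumptions, not ALICE's actual validation logic.

```python
# Illustrative schema validation sketch (not ALICE's actual logic):
# flag records that don't match the expected schema and suggest a fix
# for the common type-mismatch case.
EXPECTED_SCHEMA = {"invoice_id": str, "amount": float, "date": str}

def validate(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    for field, ftype in EXPECTED_SCHEMA.items():
        if field not in record:
            errors.append(f"missing field '{field}'")
        elif not isinstance(record[field], ftype):
            errors.append(
                f"'{field}' should be {ftype.__name__}, got "
                f"{type(record[field]).__name__} - consider casting upstream"
            )
    return errors

# A common problem: a numeric field arriving as text.
print(validate({"invoice_id": "INV-1", "amount": "99.50", "date": "2024-01-31"}))
```

Only records that pass a check like this would proceed to the engineering (wrangling) stage.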
Engineering
Data engineering in the form of cleansing, enriching, integrating and/or consolidating disparate and siloed data is performed on each ingestion, no matter the size. Ingested data can be used to build more than one data asset for use in one or many digital services. Additionally, other data science toolsets, for example machine learning or cognitive computing, can be applied to the ingested data.
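A small sketch of what cleansing and consolidating two siloed sources into a single data asset can look like. The sources, keys and steps are illustrative; in ALICE these engineering steps are configurable rather than hard-coded.

```python
# Illustrative only: consolidate two siloed sources (a CRM and a billing
# system) into one data asset, keyed on a cleansed email address.
crm = [{"email": "a@x.com ", "name": "Ada"}, {"email": "b@x.com", "name": "Bob"}]
billing = [{"email": "A@X.COM", "balance": 120.0}]

def clean(email: str) -> str:
    # Cleansing: normalize the join key (trim whitespace, lowercase)
    return email.strip().lower()

# Integration: index CRM records by the cleansed key
consolidated = {clean(r["email"]): dict(r, email=clean(r["email"])) for r in crm}

# Enrichment: merge billing data onto the matching CRM records
for r in billing:
    consolidated.setdefault(clean(r["email"]), {}).update(balance=r["balance"])

print(consolidated["a@x.com"])  # {'email': 'a@x.com', 'name': 'Ada', 'balance': 120.0}
```

Without the cleansing step, `"a@x.com "` and `"A@X.COM"` would never match, and the enrichment would silently fail; that is the kind of siloed-data problem the engineering stage exists to solve.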
Storage
ALICE’s data is stored in the cloud in a secure and lossless manner with scalable performance and storage. The target data structure post ingestion includes the required data contents, validated by type and expectation, with appropriate tagging of fields for use in metrics, procedural filters and context flags, together with a record of the steps taken to achieve this. This also makes ALICE’s stored data auditable and searchable, among other rich features of well-structured information.
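To illustrate the target structure described above, a sketch of a stored record with typed, tagged fields plus a lineage trail. The field names, tag vocabulary and lineage format are hypothetical, not ALICE's actual storage schema.

```python
# Illustrative sketch of a post-ingestion stored record (hypothetical schema):
# typed and tagged fields, plus a record of the steps taken to produce them.
stored_record = {
    "fields": {
        "amount":   {"value": 120.0, "type": "float", "tags": ["metric"]},
        "region":   {"value": "EMEA", "type": "str",  "tags": ["procedural_filter"]},
        "restated": {"value": True,   "type": "bool", "tags": ["context_flag"]},
    },
    # Auditability: the steps that produced this record
    "lineage": ["collected:sftp", "validated:schema_v3", "engineered:dedupe"],
}

# Tagging makes the store searchable: find every field usable as a metric.
metrics = [k for k, f in stored_record["fields"].items() if "metric" in f["tags"]]
print(metrics)  # ['amount']
```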
Outcomes
A digital service is created when procedures are applied to data assets, resulting in outcomes. Outcomes need to be documented, visualized and communicated in order for the digital service to help you govern, manage and monitor your business. ALICE comes with in-built working papers and dashboards; however, her data points can also be accessed and/or integrated into other visualization and/or governance tools of choice. ALICE also communicates with her users on a regular basis in the form of alerts or notifications.
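A minimal sketch of how outcomes might drive notifications, assuming a simple exception-count threshold. The service names, threshold and message format are all illustrative, not ALICE's actual alerting API.

```python
# Hypothetical alerting sketch: outcomes that exceed a threshold of
# exceptions produce a notification. Names are illustrative only.
def notify(outcomes: list[dict], threshold: int = 0) -> list[str]:
    """Build alert messages for digital-service outcomes that need attention."""
    return [
        f"ALERT: {o['service']} found {o['exceptions']} exception(s)"
        for o in outcomes
        if o["exceptions"] > threshold
    ]

outcomes = [
    {"service": "duplicate-payments", "exceptions": 3},
    {"service": "access-review", "exceptions": 0},
]
print(notify(outcomes))  # ['ALERT: duplicate-payments found 3 exception(s)']
```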
Maintenance
Maintenance of the ALICE platform is limited. If ALICE is provided with data directly, no maintenance is required, as ALICE is delivered as a platform service. If on-premise ALICE connectors are used, automated updates may be activated on a client-selected schedule.