Microsoft StreamInsight provides a powerful platform for developing and deploying complex event processing (CEP) applications. CEP is a technology for high-throughput, low-latency processing of event streams. Typical event stream sources include data from manufacturing applications, financial trading applications, Web analytics, or operational analytics. The StreamInsight stream processing architecture and the familiar .NET-based development platform enable developers to quickly implement robust and highly efficient event processing applications.
StreamInsight has the following key benefits:
Highly optimized performance and data throughput
StreamInsight implements a lightweight streaming architecture that supports highly parallel execution of continuous queries over high-speed data. The use of in-memory caches and incremental result computation provides excellent performance with high data throughput and low latency. Low latency is achieved because events are processed without costly data load or store operations in the critical processing path. With StreamInsight, all processing is triggered automatically by incoming events, so applications do not incur any overhead for event polling. The platform provides the functionality for handling out-of-order events. In addition, static reference or historical data can be accessed and included in the low-latency analysis.
.NET development environment
Developers can write their CEP applications using a Microsoft .NET language such as Visual C#, leveraging the advanced language platform LINQ (Language Integrated Query) as an embedded query language. Given the large community of developers already familiar with these technologies, this capability reduces development costs and shortens the time from application development to production. In the current release, StreamInsight supports only C# as the host language.
By using LINQ, developers familiar with SQL will be able to quickly write queries in a declarative fashion that process and correlate data from multiple streams into meaningful results. The optimizer and scheduler of the StreamInsight server in turn ensure optimal query performance.
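As an illustration only, a filter followed by a windowed aggregation might be expressed as shown below. The SensorReading payload type, its field names, and the threshold are hypothetical, and the exact window overloads (for example, the output-policy argument) vary across StreamInsight releases:

```csharp
using System;
using System.Linq;
using Microsoft.ComplexEventProcessing;
using Microsoft.ComplexEventProcessing.Linq;

// Hypothetical payload type; public fields or properties can be projected.
public class SensorReading
{
    public string SensorId { get; set; }
    public double Value { get; set; }
}

class QuerySketch
{
    // 'source' would be bound to an input adapter elsewhere in the application.
    static void Define(CepStream<SensorReading> source)
    {
        // Declarative filter over the stream.
        var highReadings = from e in source
                           where e.Value > 100.0
                           select e;

        // Average per five-second tumbling window.
        var avgPerWindow = from win in highReadings.TumblingWindow(
                               TimeSpan.FromSeconds(5),
                               HoppingWindowOutputPolicy.ClipToWindowEnd)
                           select new { Average = win.Avg(e => e.Value) };
    }
}
```

Because the query is declarative, the StreamInsight optimizer and scheduler decide how the windowing and aggregation are actually executed.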
Flexible deployment capability
StreamInsight supports three deployment scenarios:
Fully integrated into the application as a hosted (embedded) DLL.
As a stand-alone server with multiple applications and users sharing the server. In its stand-alone configuration, the StreamInsight server runs in a wrapper such as an executable, or the server could be packaged as a Windows Service.
The hosted or stand-alone StreamInsight server could be part of a server farm.
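A rough sketch of the first two deployment options follows. The instance name "Default", the application name, and the endpoint URL are placeholders; the actual values depend on how the StreamInsight instance was installed and exposed:

```csharp
using System;
using System.ServiceModel;
using Microsoft.ComplexEventProcessing;

class DeploymentSketch
{
    static void Main()
    {
        // Embedded: host the StreamInsight server in-process as a DLL.
        using (Server embedded = Server.Create("Default"))
        {
            Application app = embedded.CreateApplication("DeviceMonitoring");
            // ... register event types, adapters, and queries here ...
        }

        // Stand-alone: connect to a server running in its own host process
        // (for example, the StreamInsight Windows service).
        using (Server remote = Server.Connect(
            new EndpointAddress("http://localhost/StreamInsight/Default")))
        {
            // ... look up applications and manage running queries ...
        }
    }
}
```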
The monitoring and manageability features built into the StreamInsight server provide for low total cost of ownership (TCO) of CEP applications. The management interface and diagnostic views that are provided in the StreamInsight server allow the administrator to monitor and manage the CEP application. The manageability framework also allows for ISVs and system integrators to remotely monitor and support StreamInsight-deployed systems at manufacturing and other scale-out installations.
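For example, an administrator or tool can read a diagnostic view programmatically; the sketch below assumes the cep:/ URI scheme used by StreamInsight diagnostic views, and the specific view names may differ by release:

```csharp
using System;
using Microsoft.ComplexEventProcessing;

class MonitoringSketch
{
    static void Show(Server server)
    {
        // Diagnostic views are addressed by URI; this one covers the
        // server's event manager. Query-level views use the query's URI.
        DiagnosticView view = server.GetDiagnosticView(
            new Uri("cep:/Server/EventManager"));

        foreach (var property in view)
            Console.WriteLine("{0} = {1}", property.Key, property.Value);
    }
}
```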
StreamInsight provides a stand-alone event flow debugger that can be used to analyze, diagnose, and troubleshoot the queries used in StreamInsight applications.
The need for high-throughput, low-latency processing of event streams is common to the following business scenarios:
Manufacturing process monitoring and control
Web click stream analysis
Algorithmic trading in financial services
Power utility monitoring
The following sections discuss some of these scenarios and investigate their requirements for event processing.
Manufacturing Process Monitoring and Control
To ensure that products and processes are running optimally and with the least amount of downtime, manufacturing companies require low-latency data collection and analysis of plant-floor devices and sensors. The typical manufacturing scenario includes the following requirements:
Asset-based monitoring and aggregation of machine-borne data.
Sensor-based observation of plant floor activities and output.
Observation and reaction through device controllers.
Ability to handle up to 10,000 data events per second.
Event and alert generation the moment something goes wrong.
Proactive, condition-based maintenance on key equipment.
Low-latency analysis of aggregated data (windowed and log-scales).
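A requirement such as "alert generation the moment something goes wrong" maps naturally onto a standing filter query. In this sketch the MachineEvent type, its fields, and the temperature threshold are all hypothetical:

```csharp
using System;
using Microsoft.ComplexEventProcessing;
using Microsoft.ComplexEventProcessing.Linq;

// Hypothetical plant-floor telemetry payload.
public class MachineEvent
{
    public string MachineId { get; set; }
    public double Temperature { get; set; }
}

class AlertSketch
{
    static void Define(CepStream<MachineEvent> telemetry)
    {
        // The query fires as each event arrives -- no polling loop.
        var alerts = from e in telemetry
                     where e.Temperature > 90.0
                     select new { e.MachineId, Alert = "Overheat" };
    }
}
```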
Web Click Stream Analysis
An optimal customer experience on a commercial Web site requires low-latency processing of user behavior and interactions at the site. The typical click stream analysis application includes the following requirements:
Ability to drive page layout, navigation, and presentation based on low-latency click stream analysis.
Ability to handle up to 100,000 data events per second during peak traffic times.
Immediate click-stream pattern detection and response with targeted advertising.
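A continuously updated click rate per page is one building block for this kind of analysis. The sketch below assumes a hypothetical Click payload with a PageUrl field and uses a one-minute hopping window that advances every five seconds; the exact HoppingWindow overload varies by StreamInsight release:

```csharp
using System;
using Microsoft.ComplexEventProcessing;
using Microsoft.ComplexEventProcessing.Linq;

// Hypothetical click event payload.
public class Click
{
    public string PageUrl { get; set; }
}

class ClickRateSketch
{
    static void Define(CepStream<Click> clicks)
    {
        var clickRates = from c in clicks
                         group c by c.PageUrl into perPage
                         from win in perPage.HoppingWindow(
                             TimeSpan.FromMinutes(1),
                             TimeSpan.FromSeconds(5),
                             HoppingWindowOutputPolicy.ClipToWindowEnd)
                         select new { Page = perPage.Key,
                                      Clicks = win.Count() };
    }
}
```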
Algorithmic Trading in a Financial Services Environment
Algorithmic trading, with its high volume data processing needs, typically has the following requirements:
Ability to handle up to 100,000 data events per second.
Time-critical query processing.
Monitoring and capitalizing on current market conditions with very short windows of opportunity.
Smart filtering of input data.
Ability to define patterns over multiple data sources and over time to automatically trigger buy/sell/hold decisions for assets in a portfolio.
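Correlating multiple data sources can be sketched as a temporal join, which in StreamInsight matches events whose lifetimes overlap. The Quote and Target types, their fields, and the trading rule here are invented purely for illustration; this is not trading logic:

```csharp
using Microsoft.ComplexEventProcessing;
using Microsoft.ComplexEventProcessing.Linq;

// Hypothetical market-data and signal payloads.
public class Quote  { public string Symbol { get; set; } public double Price { get; set; } }
public class Target { public string Symbol { get; set; } public double BuyBelow { get; set; } }

class TradingSketch
{
    static void Define(CepStream<Quote> quotes, CepStream<Target> targets)
    {
        // Join the live quote stream with a slower-moving signal stream.
        var signals = from q in quotes
                      join t in targets
                          on q.Symbol equals t.Symbol
                      where q.Price <= t.BuyBelow
                      select new { q.Symbol, q.Price, Action = "Buy" };
    }
}
```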
Power Utility Monitoring
The utility sector requires an efficient infrastructure for managing electric grids and other utilities. These systems typically have the following requirements:
Immediate response to variations in energy or water consumption, to minimize or avoid outages or other disruptions of service.
Gaining operational and environmental efficiencies by moving to smart grids.
Multiple levels of aggregation along the grid.
Ability to handle up to 100,000 events per second from millions of data sources.
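Aggregation at one level of the grid can be sketched as a grouped, windowed sum. The MeterReading type, the FeederId grouping key, and the 15-minute window are hypothetical choices for illustration:

```csharp
using System;
using Microsoft.ComplexEventProcessing;
using Microsoft.ComplexEventProcessing.Linq;

// Hypothetical smart-meter reading payload.
public class MeterReading
{
    public string FeederId { get; set; }
    public double Kwh { get; set; }
}

class GridSketch
{
    static void Define(CepStream<MeterReading> meterReadings)
    {
        // Total consumption per feeder, per 15-minute tumbling window.
        var usagePerFeeder = from r in meterReadings
                             group r by r.FeederId into perFeeder
                             from win in perFeeder.TumblingWindow(
                                 TimeSpan.FromMinutes(15),
                                 HoppingWindowOutputPolicy.ClipToWindowEnd)
                             select new { Feeder = perFeeder.Key,
                                          TotalKwh = win.Sum(r => r.Kwh) };
    }
}
```

Further levels of aggregation along the grid can be expressed by feeding this result stream into another grouped query.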