Freezer Monitoring Tutorial for HANA Smart Data Streaming

Transcription

Smart Data Streaming – Freezer Monitoring Tutorial
Incorporating Real-Time Event Stream Processing into SAP HANA Applications
TABLE OF CONTENTS
Before you start
Setup Notes
Create a streaming project and capture events in HANA
Overview
Setup
Open SAP HANA Studio
Connect to a HANA system
Open the Streaming Perspectives
Connect SAP HANA Studio to the Streaming server
Configure HANA Data Service for Streaming connections
Create a new Streaming Project
Create the tables needed for this demo
Create an input stream
Add a filter
Capture events in HANA table
Compile and check for errors
Summary
Join event stream to a HANA table, watch for trends, generate alerts
Overview
Add a HANA table to the streaming project
Join the event stream to the HANA table
Add an aggregation window
Generate Alerts
Summary
Run and test the project
Overview
Run the project, play in some data, view results
Use manual input and event tracer tools
Summary
Watch for patterns of events; use the CCL editor
Overview
Add a Pattern stream
Summary (2)
Advanced CCL: custom Flex operators
Overview
PowerOutage Flex
Notes on the CCLScript on the PowerOutage Flex
Dashboard Flex
CCL for the Dashboard Flex
Extra: attach outputs to HANA tables
Summary (3)
Appendix – Creating a HANA User and Granting Smart Data Streaming Permissions
Creating a New User
BEFORE YOU START
Pre-requisites:
 You must have an SAP HANA SPS10 system with smart data streaming installed that you can connect to and
use
 You must have SAP HANA Studio 2 installed with the streaming plug-in installed and configured
 Extract the contents of the “Freezer_Monitoring_Tutorial – Supporting Files” zip file on your local Windows
machine (the machine where your HANA Studio is installed). The extracted content will be used later in this tutorial.
 This tutorial assumes you will be using the HANA SYSTEM user to connect to your HANA system and will
refer to it as “SYSTEM”. If you would like to create a different user, the Setup Notes section of this tutorial
outlines the process.
Overview:
In this tutorial, we will be building a simple device monitoring scenario where we receive a stream of events from
sensors on coolers, freezers and vending machines. We filter and monitor the data, capturing data in HANA, and
enriching the raw event data with information from HANA. There are several outputs including:
 alerts when a machine is not maintaining the desired temperature, based on a moving average, to eliminate
brief spikes when the door is held open
 alerts when the power to a machine has been out for more than a specified period of time
 a summary view ("dashboard") that is continuously updated to show the current status of each machine
A copy of the completed project is included in the “Freezer_Monitoring_Tutorial – Supporting Files.zip” file. If you wish
to see the completed project without having to work through the entire tutorial, this completed project can be imported
into HANA Studio.
If you have time constraints, it is recommended that you first take a look at the different exercises and then decide
which ones you want to work through. However, all users should complete at least the following before jumping
around, since later exercises build on these initial ones:
 Chapter 1 - all exercises up through "Create an input stream"
 Chapter 2 - the first two exercises, completing the join between the event stream and the HANA table
Chapter 1: Create a streaming project and capture events in HANA
Estimated Duration: 30 minutes
Chapter 2: Join event streams to a HANA table, watch for trends, and generate alerts
Estimated Duration: 20-30 minutes
Chapter 3: Run and test the project
Estimated Duration: 15 minutes
Chapter 4: Watch for patterns of events; use the CCL editor
Estimated Duration: 20 minutes
Chapter 5: Advanced CCL: custom Flex operators
Estimated Duration: 30-45 minutes
SETUP NOTES
Backend Prep
1. You need user credentials to connect to both the HANA Database and HANA Streaming Server, and
permission to:
a. Create and write to database tables
b. Run streaming projects
Install Freezer_Monitoring_Final Demo
1. Download Freezer_Monitoring_Tutorial - Supporting Files.zip to your HANA Studio machine.
2. Extract the files to your local hard drive
3. Import the CCL project “freezer_monitoring_final” into your studio workspace. This project shows the
results of the completed tutorial and can be used for reference if you have difficulty working through any
section of the tutorial.
a. With Studio open, click on File and select Import
b. With the Import window now open, expand SAP HANA smart data streaming, select Streaming
Project and click Next
c. Browse to the freezer_monitoring_final directory and select Finish
CREATE A STREAMING PROJECT AND CAPTURE EVENTS IN HANA
Overview
Estimated time: 30 minutes
Objective
In this set of exercises you will learn how to create a streaming project to receive events, filter the events, and capture
the desired events in a HANA table. We will use the SAP HANA Studio with the Streaming plug-in, and we will use the
visual editor to create a new streaming project.
Exercise Description
 Create a new streaming project: a project contains one or more input streams/windows and then directs the
flow of data through continuous queries and/or custom operators, publishing output events via adapters.
 Define an input stream to receive incoming events: All incoming events enter a project via an input stream or
input window. The schema of the input stream (or window) defines the fields that events on that stream will (or
may) contain.
 Apply a filter to only capture events of interest. Filters match each event against the defined criteria, only
passing those that match.
 Add a HANA output adapter to connect the filtered stream to a HANA table, capturing the events in the HANA
database.
Setup
Open SAP HANA Studio
From the Windows Start menu, open the SAP HANA Studio
Explanation
Screenshot
From the Windows Start menu, run
SAP HANA Studio
Connect to a HANA system
Connect the studio to the HANA system we will be using
Explanation
Screenshot
1. In the SAP HANA Administration Console
perspective, right click in the white space
within the Systems view.
2. Select the Add
System...
menu item to execute it.
You can also press s.
3. Enter the host name of the HANA server
you will connect to. Obtain these from
your system administrator.
4. Enter the Instance Number
Explanation
Screenshot
5. Click
.
You can also press Alt+n.
6. Enter SYSTEM as the user name
7. Enter your password
Explanation
Screenshot
8. Click
to store user name and password
in secure storage (so you won’t have to reenter your credentials when re-connecting
to the system).
9. Click Finish
You can also press Alt+f.
.
Open the Streaming Perspectives
If they aren't already, open the SAP HANA Streaming Development and Run-Test perspectives in SAP HANA studio
Explanation
Screenshot
Open the streaming perspectives
1. Go to Window / Open Perspective and
click the
Other...
menu item to execute it.
You can also press o.
2. Select the SAP HANA Streaming
Development entry by clicking it.
3. Click OK
.
Explanation
Screenshot
4. Repeat for the SAP HANA Streaming
Run-Test perspective entry.
Connect SAP HANA Studio to the Streaming server
You need to connect your studio to the SAP HANA smart data streaming server that you will be using.
Explanation
Screenshot
Go to the Streaming Run-Test perspective
In the Server view, select the “New Server
URL” tool
Enter the host name and port of the streaming
server you will connect to. Obtain these
from your system administrator.
NOTE: this is the host name and port for the
streaming server, not the HANA database
server.
You should now see the streaming server
listed in the server view. Right click it and
select Connect Server. Enter the
streaming server’s credentials and press
OK.
Explanation
Screenshot
Now go to Studio Preferences.
Set the Default Server URL to your server
(format: hostname:port)
Configure HANA Data Service for Streaming connections
SDS Data Services define connections to databases that the SDS projects can use. Here, we need to define an SDS
data service for the HANA system we will be using.
Explanation
Screenshot
1. Click SAP HANA Streaming
Development tab to open this perspective
2. Click the Data Services
tab to
open this view. This is normally tucked
behind the Projects view.
Explanation
3. Right click on your connected streaming
host.
Screenshot
4. Click the Add HANA Service menu item to
execute it.
5. Click on newservice1 to select it.
6. In the properties view, enter the Username
and Password corresponding to the HANA
database you wish to connect to and
check “Use Default HANA Server”.
7. Rename the service. Right click on
newservice1 in the Data Services view
and select Rename Service. Change the
name to freezermon_service
Explanation
Screenshot
8. To verify that the service is properly
configured, right click again on the
freezermon_service, and select
Discover. When the service is correctly configured, executing the Discover
operation will display the database
schema that is accessible through the
service.
Create a new Streaming Project
Explanation
Screenshot
1. Click the File
menu item to execute
it.
You can also press Alt+f.
2. Click the New > >
Project...
menu item to execute it.
You can also press r.
3. Click New Streaming Project.
4. Then Click
You can also press Alt+n.
.
5. Enter freezer_monitoring in the Name
box. (Be sure to use only lower case.)
6. You can leave the Directory to be your
default workspace or you can change it
7. Click Finish
You can also press Alt+f.
.
Create the tables needed for this demo
1. Open the demo file Freezer_Monitoring_Create_Tables_and_LoadData.sql
i. Select File > Open File…
ii. Navigate to the …\freezer_monitoring_final\data directory and open
Freezer_Monitoring_Create_Tables_and_LoadData.sql
iii. With the .sql file now open, click the Choose Connection button
iv. Select the <sid> (SYSTEM) you are working with and click OK
2. Execute the .sql script by pressing the Execute button
3. Confirm that the statements executed correctly. You should have these tables:
And all will be empty except for MACHINE_REF, which will have 7 rows of data:
Create an input stream
The first thing we'll do to our new project is to configure an input stream
Explanation
Screenshot
1. When you created the project, it created an
input stream called NEWSTREAM by
default
2. Click the icon to the left of the name and
change the name of this stream to
MACHINEDATA, then press ENTER
3. Click on the + to the left of Schema to
expand it
5. Now hover the cursor over the right edge of
the MACHINEDATA shape so that the
toolbar appears. Click on the Add
Column tool
6. Repeat step 5 three more times, so there
are a total of 5 columns
Explanation
Screenshot
7. Double click on the name Column1 and
change the name of this column to
MACHINEID
8. Double click on (INTEGER) to the right of
MACHINEID to change the datatype for
this column to string.
(after double clicking, click the drop down
arrow, and select string)
9. Change the rest of the column names and
data types as shown:
Explanation
Screenshot
26. Click here to collapse the shape
Now we can just drag the shape to a
new position to organize our diagram
27. Drag the shape
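For reference, if you press F6 at this point to look at the generated CCL, the input stream definition should look roughly like the sketch below. Only MACHINEID is named explicitly in the steps above; the remaining column names and types come from the screenshot, so the ones shown here are inferred from how the columns are used later in the tutorial and may differ slightly from your project:
-- raw sensor events from coolers, freezers and vending machines
CREATE INPUT STREAM MACHINEDATA
SCHEMA (
    MACHINEID string,
    EVENT_TIME msdate,
    EVENT_NAME string,
    EVENT_VALUE string,
    EVENT_DESCRIPTION string );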
Add a filter
Now let's add a very simple "continuous query". In this case, we will use a simple filter. This will filter the raw input
stream into a subset of the incoming events that we are interested in - the events that meet the filter criteria (i.e.
events for which the filter expression evaluates to "true").
Explanation
Screenshot
1. Click Filter
in the Streams and
Windows drawer of the Palette.
2. Click anywhere on the canvas to place the
filter
3. Click the Filter icon and rename the
stream to ACTIVITY_HIST by typing in the
text field.
Confirm your entry by pressing the Enter
key.
4. Click Connector
.
5. Click on MACHINEDATA as the starting
point for the connection
6. Now click on the ACTIVITY_HIST shape to
complete the connection
Explanation
Screenshot
7. Double-click on the filter expression to edit it
8. Enter
MACHINEDATA.EVENT_NAME='DOOR'
in the text box to define the filter
expression.
Use Ctrl+Space for content assist
Confirm your entry by pressing the Enter
key.
9. Now let's clean up the diagram. Click All Iconic to collapse shapes for more space on the canvas
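In CCL, the filter element you just configured is roughly equivalent to the following sketch (assuming the element is exposed as an output stream; the visual editor generates the actual text for you):
-- pass through only the door open/close events
CREATE OUTPUT STREAM ACTIVITY_HIST
AS SELECT * FROM MACHINEDATA
WHERE MACHINEDATA.EVENT_NAME = 'DOOR';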
Capture events in HANA table
In the previous step we added a filter to create a stream that just contains door open and close events. We want to
log all the door open/close events in HANA. These represent customer activity (not necessarily purchases) and by
creating a history, we will have the ability to analyze it over time.
To do this, we simply attach a HANA output adapter to the stream we wish to capture, and point it at the HANA table
that will receive the data.
Explanation
Screenshot
1. Click HANA Output
. in the Output Adapters section of the
palette
2. Then click on the canvas to drop it into
the project diagram.
3. Click
to Edit the Adapter Properties
4. Select the freezermon_service data service created earlier.
5. Select the entry Target Database Schema Name by clicking it.
6. Click the ... button.
Explanation
Screenshot
7. Enter the name of the HANA database
schema you will be using in the Value box.
We have created our tables in SYSTEM
so use that.
8. Then click OK
.
9. Enter ACTIVITY_HIST in the Target
Database Table Name text box.
Confirm your entry by pressing the Enter
key.
10. Then click OK
.
11. Now use the connector tool to connect the
adapter to the ACTIVITY_HIST stream.
Select the connector tool, click on the
ACTIVITY_HIST stream, then click on the
HANA_Output1 adapter.
Compile and check for errors
Explanation
1. Click Compile Project
Screenshot
.
2. Check the Problems view to see if the
project compiled without errors
Summary
You have completed the exercise!
You are now able to:
 Create a streaming project
 Create an input stream to receive events
 Create a filter to narrow the event stream down to only the events of interest
 Capture an event stream in a HANA table
JOIN EVENT STREAM TO A HANA TABLE, WATCH FOR TRENDS, GENERATE
ALERTS
Overview
Estimated time: 20-30 minutes
Objective
In the following exercises you will learn how to enrich event streams with data from HANA, create aggregation
windows on an event stream to look at trends, and generate alerts under certain conditions.
Exercise Description
 Join an event stream to a HANA database table using a REFERENCE element
 Create a 30 second window to compute a moving average
 Add a filter to generate alerts when moving average falls outside desired range
Add a HANA table to the streaming project
We can access HANA tables (and views) directly in streaming projects. This is useful in a number of ways:
 To enrich incoming events with reference data from HANA
 To get information from HANA that will control the behavior of the streaming project - for example, change filter values or change the list of values to match events against
To do this we add a CCL "REFERENCE" to our streaming project. This is a CCL element that points to the HANA
table. When events get joined to the REFERENCE element, the HANA table is queried and the results are used in the
CCL model. The REFERENCE element includes optional caching properties so that repeat queries can be filled from
the cache rather than going back to HANA each time. This includes an aging property to force the cache to be
refreshed.
Note that changes to the HANA table do not stream into the streaming project when they happen, but they will be
picked up the next time the HANA table is queried.
Explanation
Screenshot
1. Navigate to the SAP HANA
Administration Console perspective
Now we are going to select a HANA
table from the Systems view and add
it to our Streaming project
2. Open the catalog for the HANA system in
the HANA Systems view
Explanation
Screenshot
3. Open the schema you are using and drag
the table MACHINE_REF onto the canvas
Note: if you don’t see the tables you have
created, try right clicking the schema and
selecting Refresh
4. Drop onto the canvas
5. Click Reference
6. Click Inline
7. Then click OK
8. Click here
and change the name of
the reference element to MACHINE_REF
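Behind the scenes, dragging the table onto the canvas produces a REFERENCE element in CCL, roughly like the sketch below. The SCHEMA columns are generated from the HANA table (the names and types listed here are inferred from later steps), and the property names are assumptions about what the wizard emits, so inspect the actual element in the CCL editor rather than typing this in by hand:
CREATE REFERENCE MACHINE_REF
    SCHEMA (
        MACHINEID string,
        MACHINETYPE string,
        MAX_TEMP decimal(4,2),
        MIN_TEMP decimal(4,2),
        TEMP_UNIT string )
    PRIMARY KEY (MACHINEID)
    PROPERTIES
        service = 'freezermon_service',   -- the data service configured earlier
        source = 'MACHINE_REF',           -- the HANA table that is queried
        sourceSchema = 'SYSTEM';          -- the schema the demo tables were created in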
Join the event stream to the HANA table
Now we will join our event stream to the HANA reference table. This is very much like a conventional database
join. The only difference is that this join is continuous: it updates every time a new event is received on the input
stream to the join.
Explanation
Screenshot
1. Click Join in the Palette
2. Drop it onto the canvas
3. Select the Connector tool.
Tip: press the Shift key while performing the action. This will keep it selected for adding multiple connections
4. First click on MACHINEDATA and then
Join1 to connect them
Explanation
Screenshot
Now add a connection from
MACHINE_REF to Join1
5. Click Select
to release
the Connector tool (or press Esc)
6. Rename Join1 to EVENTS.
Confirm your entry by pressing the Enter
key
7. Hover over the EVENTS shape so that the
toolbar appears and then click Add
Column Expression
8. Click the Copy Columns from
Input
menu item to execute it.
You can also press c.
Explanation
Screenshot
9. Click Select All
.
You can also press Alt+s.
10. Uncheck the 2nd MACHINEID field (we
don't want it twice)
11. Then click OK
.
12. Now set the join condition. Double-click on
.
13. When it prompts you to save the project,
click Yes
Explanation
Screenshot
14. We want to join on MACHINEID.
Select MACHINEID in each source column.
Then click Add
15. Click OK
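The resulting EVENTS element corresponds roughly to the CCL sketch below. The column list mirrors what you selected in the Copy Columns dialog, and the reference-table column names are inferred from later steps, so treat this as illustrative:
CREATE OUTPUT STREAM EVENTS
AS SELECT
    MACHINEDATA.MACHINEID AS MACHINEID,
    MACHINEDATA.EVENT_TIME AS EVENT_TIME,
    MACHINEDATA.EVENT_NAME AS EVENT_NAME,
    MACHINEDATA.EVENT_VALUE AS EVENT_VALUE,
    MACHINEDATA.EVENT_DESCRIPTION AS EVENT_DESCRIPTION,
    MACHINE_REF.MACHINETYPE AS MACHINETYPE,
    MACHINE_REF.MAX_TEMP AS MAX_TEMP,
    MACHINE_REF.MIN_TEMP AS MIN_TEMP,
    MACHINE_REF.TEMP_UNIT AS TEMP_UNIT
-- each incoming event is enriched with that machine's reference data from HANA
FROM MACHINEDATA INNER JOIN MACHINE_REF
    ON MACHINEDATA.MACHINEID = MACHINE_REF.MACHINEID;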
Add an aggregation window
Until now we've been working with streams, which are stateless. But you really can't get much information from a
single event. If you want to look at trends or patterns, or correlate different events, you first need to create a window
that will hold some recent events.
Here we will create a window, group events in the window by Machine ID, and then compute an average temperature
for the group. We're going to set a time-based retention policy for this window, in this case 30 seconds. So our
average becomes a moving average: it will get updated every time a new event arrives in the group and will also get
updated as each event ages out of the group.
While we will use a time-based window in this exercise, note that other types of windows are supported. Instead of
time we could set a size policy - KEEP 100 rows, for example.
The other thing we will do here is apply a filter to the events included in each group. In this case we are only interested
in temperature events.
Next, in just a few steps, we will create a 30 second moving average of the temperature on each machine.
Explanation
Screenshot
1. Click Aggregate
2. Drop onto canvas
3. Rename the stream to AVG_TEMP
Confirm your entry by pressing the Enter
key
4. Click Connector
5. Click on the EVENTS stream
6. Click on AVG_TEMP
Explanation
Screenshot
7. Click Add Column Expression
8. Click the Copy Columns from
Input
menu item to execute it.
You can also press c.
9. Click Select All
You can also press Alt+s.
10. Uncheck
EVENTS.EVENT_NAME
.
.
11. Uncheck
EVENTS.EVENT_DESCRIPTION
.
12. Uncheck
EVENTS.MACHINETYPE
.
13. Click OK
.
Explanation
Screenshot
14. Now we will create a window on the input
to this aggregation.
Click on the input with the right mouse button.
15. Click the Keep
Policy
menu item to execute it.
You can also press k.
16. Click Time
.
17. Enter 30 seconds in the entry box.
18. Click OK
.
Explanation
Screenshot
19. To define the GROUP BY clause, expand
the tab
.
20. Double-click on
.
21. Select the entry EVENTS.MACHINEID by
clicking it.
22. Click Add >>
.
23. Click OK
.
24. Now we need to add a GROUP filter,
since we only want to aggregate
temperature readings.
Click Add Group Clause
.
Explanation
Screenshot
25. Click the Group Filter
Clause
menu item to execute it.
You can also press g.
26. Double-click on
.
27. Enter EVENTS.EVENT_NAME='TEMP'
as the filter expression in the text box.
Note that a GROUP FILTER filters the
incoming events before aggregation.
Use Ctrl+Space for content assist.
Confirm your entry by pressing the Enter
key.
28. To edit the column expressions, expand
the tab
29. Double-click on
30. Edit the expression for EVENT_TIME.
Change it to:
last(EVENTS.EVENT_TIME)
This will cause the aggregate values for the
group to show the event time of the last
event received in the group.
Confirm your entry by pressing the Enter
key
Explanation
Screenshot
31. Double click the name EVENT_VALUE
and rename this column to AVG_TEMP by
typing in the text field.
Confirm your entry by pressing the Enter
key
32. Double-click on the expression for
AVG_TEMP, which is currently set
to:
33. Edit this expression to compute an
average. Also, since the value field is a
string, before we can compute an
average, we need to convert it to a
number.
Change the expression to:
avg(to_decimal(EVENTS.EVENT_VALUE, 4, 2))
Confirm your entry by pressing the Enter key.
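Taken together, the aggregation corresponds roughly to the CCL sketch below (assumptions: the element is an output window with a deduced primary key, and the reference columns are passed through exactly as copied; if the compiler objects to non-aggregated columns, wrap them in last()):
CREATE OUTPUT WINDOW AVG_TEMP
PRIMARY KEY DEDUCED
AS SELECT
    EVENTS.MACHINEID AS MACHINEID,
    last(EVENTS.EVENT_TIME) AS EVENT_TIME,              -- time of the newest event in the group
    avg(to_decimal(EVENTS.EVENT_VALUE, 4, 2)) AS AVG_TEMP,
    EVENTS.MAX_TEMP AS MAX_TEMP,
    EVENTS.MIN_TEMP AS MIN_TEMP,
    EVENTS.TEMP_UNIT AS TEMP_UNIT
FROM EVENTS KEEP 30 SECONDS                             -- unnamed 30 second sliding window
GROUP FILTER EVENTS.EVENT_NAME = 'TEMP'                 -- only temperature readings are aggregated
GROUP BY EVENTS.MACHINEID;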
Generate Alerts
Now for the final step in this section: we will add a stream that produces alerts. While sometimes the logic to produce
an alert could be complex, here it's very simple and we can do it with a simple filter. We've already enriched the event
stream with the maximum acceptable temperature for each machine (obtained from HANA) and computed a moving
average of the actual temperature of each machine (to smooth out brief variations caused by an open door), so now
we can simply apply a filter to watch for machines where the current average temperature is greater than the
maximum allowed for that machine.
One difference here is that we are not going to use the Filter tool from the palette. Instead we will use the "Derived
Window" tool. The reason for this is that we want to add some columns to the output. The filter tool doesn't let us do
that.
The Derived Window and Derived Stream tools expose the full power of CCL. The other tools are just there for
simplicity, but everything they can do, you can do in a derived stream or window.
Explanation
Screenshot
1. Click Derived Window in the Palette
2. Drop it onto the canvas
3. Add a connector from AVG_TEMP to the
new window
4. Change the name to ALARM_TEMP
Confirm your entry by pressing the Enter
key
Explanation
Screenshot
5. Click Add Column Expression
6. Click the Copy Columns from
Input
menu item to execute it.
You can also press c.
7. Select all except for MIN_TEMP and
TEMP_UNIT
8. Click OK
.
Explanation
Screenshot
9. Click Add Column Expression
10. Add a column to this window, and then
repeat (i.e., add 2 new columns total)
Click the Column
Expression
menu item to execute it.
You can also press c.
11. Rename the first new column
to ALARM_TYPE
Confirm your entry by pressing the Enter key
12. Rename the 2nd new column
to ALARM_DESC
Confirm your entry by pressing the Enter key
13. Double-click on
to edit the
expression.
Explanation
Screenshot
14. Enter 'TEMPERATURE' in the expression
edit box for the ALARM_TYPE column.
This will set the "type" of all alarms emitted by
this window to the string
"TEMPERATURE"
Confirm your entry by pressing the Enter
key
15. Enter 'Machine not maintaining
temperature' in the expression box for the
ALARM_DESC column
Confirm your entry by pressing the Enter
key
16. Now click Add Query Clause
to add a
query clause. We want to add a filter such
that this window only contains rows for
machines that have a current average
temperature above the max specified for
the machine
17. Click the
Filter
menu item to execute it.
You can also press f
Explanation
Screenshot
18. Double-click on the filter expression to edit it
19. Change the filter expression to:
AVG_TEMP.AVG_TEMP > AVG_TEMP.MAX_TEMP
Use Ctrl+Space for completion assist
20. Click Compile Project and check for errors
You can ignore these warnings about not having a retention policy on these windows. In some cases, the lack of a retention policy can cause the window to grow unbounded, thus the warning. Here we are "safe", however, since AVG_TEMP is an aggregation over an unnamed window that has a retention policy, and the ALARM window is fed by the aggregate
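For reference, the ALARM_TEMP window built above corresponds roughly to this CCL sketch (the deduced primary key is an assumption about what the visual editor generates):
CREATE OUTPUT WINDOW ALARM_TEMP
PRIMARY KEY DEDUCED
AS SELECT
    AVG_TEMP.MACHINEID AS MACHINEID,
    AVG_TEMP.EVENT_TIME AS EVENT_TIME,
    AVG_TEMP.AVG_TEMP AS AVG_TEMP,
    AVG_TEMP.MAX_TEMP AS MAX_TEMP,
    'TEMPERATURE' AS ALARM_TYPE,
    'Machine not maintaining temperature' AS ALARM_DESC
FROM AVG_TEMP
-- only machines whose moving average exceeds their maximum appear in this window
WHERE AVG_TEMP.AVG_TEMP > AVG_TEMP.MAX_TEMP;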
Summary
You have completed the exercise!
You are now able to:
 Join event streams to HANA tables in order to do things like enrich the raw event data, or filter or analyze the
data based on context or historical information from HANA
 Create an aggregation over a time-based sliding event window to monitor trends - in this case computing a
moving average
 Generate alerts when certain conditions are detected. Here we compare the current moving average for each
machine to the maximum allowed for that machine (as specified in the HANA database), and generate an alert
when the current value exceeds the maximum
RUN AND TEST THE PROJECT
Overview
Estimated time: 15 minutes
Objective
In the following exercises you will learn how to use a few of the most frequently used tools in the Run-Test perspective
to run the project, stream some data through it, and view the results.
Exercise Description
 Run the project that you built in the previous exercises. Here we are running the project locally on the "sandbox
server" that the Studio provides for testing
 Since we don't have a live data source we will use the playback tool to stream in data from a file, at a controlled
rate, to simulate a live source
 Use the Stream Viewer to view the output from each stream/window in our project
 Use the Manual Input tool to manually generate test events and send them into our project one at a time - very
useful for testing
 Use the Event Tracer tool to see how an event flows through the directed graph within the project
Run the project, play in some data, view results
Here we will run the project we just created and we'll use some of the studio tools to test it and view the output. We
don't have a live stream of data to connect to, so we will use the playback tool to simulate an event stream, reading a
set of events from a file. And we will use the stream viewer to view the results of each operation.
Explanation
Screenshot
1. Click the drop down arrow next to the Run
button and select the streaming server to
run this project.
When you click the Run button to
start the project, the Run-Test
perspective becomes the active
perspective
The Server View shows the running
projects and the (visible)
streams/windows in each project
2. Double-click on MACHINEDATA to open it
in the Stream View
Note: You won't see any data, because we
haven't loaded any data yet.
Explanation
Screenshot
3. Now double-click on each of the other
streams/windows to open them in the
Stream View tool
5. Click the Playback tab to select it
6. Click Select Project
in the top right
corner of the Playback window to connect
the playback tool to the current project (if
you had multiple projects running it would
ask you to choose)
7. Click Select Playback File to select the data file to use
Explanation
Screenshot
8. Change the file type to .csv
9. Choose machinedata.csv in the data
folder of the freezer_monitoring_final
project
10. Then click Open
You can also press Alt+o
11. Click rec/ms
We want to control the playback speed so that
we can watch things happen at a
reasonable pace
Once it's running we can speed it up or slow it
down using the slider tool
12. Enter 0.005 in the rec/ms box (0.005 records per millisecond is about 5 events per second). This is a good rate to start with
13. Then click the Start Playback button
14. Click each viewer tab to view the output
from each stream/window
Use manual input and event tracer tools
Now we'll use two more of the tools in the run-test perspective, both of which are very useful for testing your project.
Explanation
Screenshot
1. Click the Event Tracer
tab to select it
2. Click here
to connect the Event
Tracer tool to the running project.
If you had multiple projects running, it would
ask you to choose which one
3. Click the Manual Input tab to select it.
4. Click Select Stream
to connect the
manual input tool to the project
5. Click MACHINEDATA
6. Then click OK
Explanation
Screenshot
7. Fill out input fields as shown in the picture
Note: be sure to use "A" for the MACHINEID,
since it exists in the HANA reference
table. You can alter the other values
8. Then click Publish
to send the event
9. The colors in the Event Tracer diagram (at the top of studio) show how each stream/window in the project is affected by the event you just sent.
Double-click on each node to view the output
generated by that node in response to the
event in the Console
10. Click the Stream
View
tab to select it.
Explanation
Screenshot
11. Scroll down to see the event you entered
Summary
You have completed the exercise!
You are now able to:
 Run a project locally to test it
 Use the Stream View tool to view/watch the contents of streams and windows
 Use the Playback tool to simulate a live data source by streaming in data from a file (note that you can use this
tool to record a live stream and then play it back)
 Use the Manual Input tool to send custom events. This is very useful for testing
 Use the Event Tracer tool to see how an individual event flows through the project and how it affects each
processing node (step) in the project
WATCH FOR PATTERNS OF EVENTS; USE THE CCL EDITOR
Overview
Estimated time: 20 minutes
Objective
In the following exercise you will learn how to add a Pattern stream, which uses the CCL MATCHING clause, to watch
for a specific pattern of events. In this example we will be watching for a missing event. We will use this to generate an
alert if we receive a "Power Off" event that is not followed by a "Power On" event for the same machine, within a
defined time interval.
We will also take this opportunity to use the CCL editor. Up until now, we have been working exclusively with the
Visual Editor. The visual editor is good for users who don't know the syntax of CCL. Even experienced users like the
visual editor for quickly laying out the processing flow of a new application. But you can also edit projects directly in
the CCL editor - and you can switch back and forth; changes made in one will be reflected in the other.
Exercise Description
 Add a pattern stream to the project
 Use the CCL editor to configure the pattern stream
 Define the pattern to watch for, and the contents of the event to produce when the pattern is detected
Add a Pattern stream
One of the SQL extensions in CCL is the MATCHING clause that lets you watch for specific patterns of events. You
can even watch for missing events. The Palette includes a tool called "Pattern" that will create a derived stream using
the CCL MATCHING clause, but here we will also make use of the CCL editor to edit the underlying CCL.
So far we have been working exclusively in the Visual Editor. The visual editor is a great place to start - especially for
someone that doesn't know CCL. But you can also work anytime - or all the time - in the CCL editor. Any changes
you make to the project in one editor will be reflected in the other.
Note, however, that you should normally only open the project in one editor at a time. If you have the project open in
one editor and then open it in the other editor, the 2nd copy will be read-only.
Explanation
Screenshot
1. Click SAP HANA Streaming
Development to switch to the perspective
Explanation
Screenshot
2. Click Pattern
3. Drop it onto the canvas
4. Click
5. First click on EVENTS and then the new
Pattern1 stream to connect them
6. Rename the stream to ALARM_POWER.
Confirm your entry by pressing the Enter
key
Explanation
Screenshot
7. Click beside the pattern image in the
header of the object to select it (where 7
points to)
8. Click Switch to Text or press F6 to switch to the CCL editor
9. Click Yes
You can also press Alt+y
You are now in the CCL editor.
Notice that because the
ALARM_POWER stream was
selected when you left the visual
editor, the cursor is now positioned
at the beginning of the
ALARM_POWER stream definition
We can reuse some of the existing
code from the ALARM_TEMP
element to speed up the writing of
the ALARM_POWER functionality.
The Outline tool lets us quickly
navigate to the ALARM_TEMP
element.
10. Click Outline
to open the outline
Explanation
Screenshot
11. Click ALARM_TEMP
12. Click Outline
to close it
As you can see, we have used the
Outline as a quick way to navigate
to the ALARM_TEMP window
Explanation
Screenshot
13. Select the column expressions from the
ALARM_TEMP window and copy them
14. Now paste the column expressions into
the ALARM_POWER stream, replacing
the "*"
15. Edit the FROM clause to read EVENTS A,
EVENTS B. Since we want to watch for a
pattern of 2 events from the same stream,
we need define two different aliases for
the stream
16. Now edit the column expressions to look
like this.
1. Change the source of each column value
coming from the input stream to "A"
2. Delete the AVG_TEMP and MAX_TEMP
columns
3. Edit the text for the ALARM_TYPE and
ALARM_DESC columns.
17. When finished editing, click Visual
Editor
or press F6 to switch back to
the visual editor
Explanation
Screenshot
18. Click Add Pattern
to define the
pattern we want to watch for
We could have done this in the CCL editor,
but this is easier if you don't remember the
syntax.
After you're finished, you can go back to the
CCL editor to look at the CCL that was
created
19. Enter 20 sec in the Interval box.
Note: we will choose a very short
interval just for the purpose of this
demo - so that we don't have to wait
long to see an alarm
20. Enter the pattern we want to watch for: A, !B
The "," means "followed by" and "!" means not.
So here we want to watch for event A that is
NOT followed by event B within 20
seconds.
We could also use AND and OR in the pattern
21. Now we need to define our ON
clause. This filters the incoming events to
determine which events qualify as an "A"
event and a "B" event. Enter the
expression:
A.MACHINEID = B.MACHINEID AND
A.EVENT_VALUE = 'Power off' AND
B.EVENT_VALUE = 'Power on'
Tip: you can use Ctrl+Space for completion
assist
22. When finished editing the ON clause, click OK
23. Click Compile Project to check for errors.
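After these steps, the ALARM_POWER stream in the CCL editor should look roughly like the sketch below. The ALARM_TYPE and ALARM_DESC strings are placeholders, since the tutorial leaves their exact wording up to you:
CREATE OUTPUT STREAM ALARM_POWER
AS SELECT
    A.MACHINEID AS MACHINEID,
    A.EVENT_TIME AS EVENT_TIME,
    'POWER' AS ALARM_TYPE,                                       -- placeholder text
    'No Power on within 20 seconds of Power off' AS ALARM_DESC   -- placeholder text
FROM EVENTS A, EVENTS B
MATCHING [ 20 SEC : A, !B ]          -- event A NOT followed by event B within 20 seconds
ON A.MACHINEID = B.MACHINEID
    AND A.EVENT_VALUE = 'Power off'
    AND B.EVENT_VALUE = 'Power on';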
Summary (2)
You have completed the exercise!
You are now able to:
 Use a "Pattern Stream" - i.e. the CCL MATCHING clause - to watch for a specific pattern of events, producing a
new output event each time the pattern is detected
 Use the CCL editor as an efficient way of applying complex syntax, or as a way to use copy and paste to avoid repetition. It's also useful for seeing the details of a project, and it's often easier to identify the cause of a compiler error within the CCL editor
ADVANCED CCL: CUSTOM FLEX OPERATORS
Overview
Estimated time: 20-45 minutes
Objective
In the following exercises you will learn how to use Flex Operators with custom event processing logic written in
CCLScript (previously called SPLASH in ESP SP08 and earlier). This provides tremendous flexibility in processing
events. This is an advanced technique, however, and does require basic high level programming skills (similar to
writing SQLScript).
Exercise Description
 Add a Flex Operator to the project
 Use the Visual Editor to configure the Flex Operator
 Add a DECLARE block to define data structures
 Use CCLScript to write an event handler (method)
 Use the CCL editor to edit a Flex Operator
 See how to match/correlate multiple events to produce a single output event
 See how to use incoming events to update status information
PowerOutage Flex
A Flex operator contains one or more custom event handlers written in CCLScript. This offers more functionality than
just the standard relational operators of SQL.
Here we are going to match pairs of incoming events to produce a single output event. More specifically, we want to
record power outages, where each outage produces a single record with the time and duration of the power
outage. This is a little different from the Pattern stream we created in the last exercise, in which we produced an alert
as soon as the power has been off for more than a certain amount of time. Here, we wait for the power to come back
on, and then produce an output event that has the start time and duration of the outage.
To accomplish this, we will use this basic logic:
1. Create a dictionary that will hold the last "Power Off" time for each machine
2. When we get a power-off event, update the dictionary
3. When we get a power-on event, look up the power-off time in the dictionary and produce an output event
Explanation
Screenshot
1. Click Flex
2. And then drop it onto the canvas
Explanation
Screenshot
3. Connect the EVENTS stream to the new
Flex operator
4. Change the name to POWER_OUTAGES
Confirm your entry by pressing the Enter key
5. Click Add Column 4 times to add 4 columns
6. Change the column names and data
types to match the screenshot provided.
Double click on a column name to edit it.
To change the data type, click on the type and
then the drop down arrow to select the
desired type
7. Click the Edit Local
Declaration(s)
menu item to execute it
You can also press e
Explanation
Screenshot
8. Enter the declarations as shown in the
screenshot:
DECLARE
DICTIONARY ( string , msdate ) offtime ;
TYPEOF ( POWER_OUTAGES ) outrec ;
msdate offts ;
END
9. Then, after editing the declarations, click OK.
10. Double-click on
.
Explanation
Screenshot
11. Enter the CCLScript shown here
Note: The msdate variable type is a timestamp with millisecond precision. The default format is
YYYY-MM-DDTHH:MM:SS:SSS. When an msdate is subtracted from another msdate, the resulting value is of
type interval. The interval data type is in microseconds, hence the division by 60000000
ON EVENTS {
    if (EVENTS.EVENT_VALUE = 'Power off') {
        offtime[EVENTS.MACHINEID] := EVENTS.EVENT_TIME;
    }
    if (EVENTS.EVENT_VALUE = 'Power on' AND not isnull(offtime[EVENTS.MACHINEID])) {
        offts := offtime[EVENTS.MACHINEID];
        outrec.MACHINEID := EVENTS.MACHINEID;
        outrec.POWER_OFF_TIME := offts;
        outrec.POWER_ON_TIME := EVENTS.EVENT_TIME;
        outrec.DURATION_MIN := cast(double, EVENTS.EVENT_TIME - offts) / 60000000;
        output setOpcode(outrec, insert);
    }
};
12. Click OK
13. Click the POWER_OUTAGES stream to
select it so when we switch to the CCL
editor, the cursor will be in the right place
14. Click Switch to Text or press F6 to switch to the CCL editor
Explanation
Screenshot
15. Change the CREATE FLEX
POWER_OUTAGES statement to
produce an OUTPUT STREAM instead of
an OUTPUT WINDOW (there's no reason
to collect them here - we are going to
record them in HANA)
16. Click Compile Streaming Project (F7) to check for errors
Notes on the CCLScript on the PowerOutage Flex
1. The DECLARE block is local to this Flex operator. Global DECLARE blocks are also supported and would be put at
the top of the CCL file; outside of any CREATE... statement.
 "offtime" is a dictionary that saves the time of the last Power Off event for each machine. It will be indexed by MACHINEID, which is a string. Dictionaries are key/value pairs, where both the key and the value can be a simple type (a primitive such as an integer, string or character) or a complex type (an object which may house many simple or complex data types).
 "outrec" is a temporary data record structure that matches the schema of the output stream being created by this Flex operator.
2. The ON EVENTS method is executed every time an event is received from the EVENTS stream
 First, we check the EVENT_VALUE field of the incoming event to see if it's a Power Off event. If it is, we save the time of the Power Off event for this MACHINEID in the dictionary.
 Then, we check to see if the incoming event is a Power On event. If it is, and if there is a Power Off time for this machine in the dictionary, then we construct and publish an output event.
 Anytime we publish an event from a Flex we have to explicitly set the OpCode of the event being
produced – thus, the use of the setOpcode function.
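Putting the pieces together (and after switching the element to an output stream as described above), the complete POWER_OUTAGES Flex should end up looking roughly like this sketch. The column data types are inferred from how the columns are used in the script, so compare against the freezer_monitoring_final reference project if anything differs:
CREATE FLEX POWER_OUTAGES
IN EVENTS
OUT OUTPUT STREAM POWER_OUTAGES
    SCHEMA (
        MACHINEID string,
        POWER_OFF_TIME msdate,
        POWER_ON_TIME msdate,
        DURATION_MIN double )
BEGIN
    DECLARE
        /* last 'Power off' time seen for each machine */
        DICTIONARY ( string, msdate ) offtime;
        /* output record under construction */
        TYPEOF ( POWER_OUTAGES ) outrec;
        msdate offts;
    END;
    ON EVENTS {
        if (EVENTS.EVENT_VALUE = 'Power off') {
            offtime[EVENTS.MACHINEID] := EVENTS.EVENT_TIME;
        }
        if (EVENTS.EVENT_VALUE = 'Power on' AND not isnull(offtime[EVENTS.MACHINEID])) {
            offts := offtime[EVENTS.MACHINEID];
            outrec.MACHINEID := EVENTS.MACHINEID;
            outrec.POWER_OFF_TIME := offts;
            outrec.POWER_ON_TIME := EVENTS.EVENT_TIME;
            /* interval is in microseconds, so divide by 60000000 to get minutes */
            outrec.DURATION_MIN := cast(double, EVENTS.EVENT_TIME - offts) / 60000000;
            output setOpcode(outrec, insert);
        }
    };
END;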
Dashboard Flex
Now we are doing something a bit different:
We are going to build another Flex operator. This flex produces what we are calling the "Dashboard" which has
current status information for each Machine. The challenge is that we are receiving different types of events on the
same stream, with different uses of the "value" field. This is a common problem. But it prevents us from doing a simple
aggregation or update, since one event can have a temperature reading in the value field and another event on the
same stream can have "Power off" in the value field.
So we use CCLScript to create a new window called DASHBOARD that holds a single summary row for each
machine. With each incoming event, the script examines the event and updates the relevant summary row.
1. The local declare block creates a dictionary that holds the most recently received information for each machine.
This dictionary starts out empty, but information is added/updated as events are received.
2. The ON EVENTS method is invoked for each incoming event from the EVENTS stream
 The method gets the previous set of information for the specific machine from the dictionary (if an entry for
this machine exists in the dictionary)
 It then updates the information for this machine using the data contained in the event
 Finally, it publishes an updated summary row for this machine and updates the dictionary with the current
information
Explanation
Screenshot
We will complete this entire exercise
using the CCL editor
1. Click Switch to Text
or press F6
The CCL shown in the screenshot is
provided as text below.
Just a reminder, if you are typing it,
that you can use Ctrl+Space for
completion assistance
Now go ahead, compile the project
to check for errors, and then click the
run button to check things out in the
run-test perspective
CCL for the Dashboard Flex
CREATE FLEX DASHBOARD
IN EVENTS
OUT OUTPUT WINDOW DASHBOARD
    SCHEMA (
        MACHINEID string,
        POWER_STATUS string,
        CURR_TEMP double )
    PRIMARY KEY (MACHINEID)
BEGIN
    DECLARE
        typeof(DASHBOARD) outrec;
        dictionary(string, typeof(DASHBOARD)) prev;
    END;
    ON EVENTS {
        if (not isnull(prev[EVENTS.MACHINEID])) {
            outrec := prev[EVENTS.MACHINEID];
        }
        outrec.MACHINEID := EVENTS.MACHINEID;
        if (EVENTS.EVENT_NAME = 'POWER') {
            outrec.POWER_STATUS := EVENTS.EVENT_VALUE;
        }
        if (EVENTS.EVENT_NAME = 'TEMP') {
            outrec.CURR_TEMP := to_decimal(EVENTS.EVENT_VALUE, 4, 2);
        }
        output setOpcode(outrec, upsert);
        prev[EVENTS.MACHINEID] := outrec;
    };
END;
Extra: attach outputs to HANA tables
This is a "work on your own" exercise, applying what you have learned in the previous exercises.
Add a HANA output adapter to either POWER_OUTAGES or DASHBOARD, or both, and attach them to HANA tables
to capture the output from either or both of these in HANA.
Hint: Review the exercise in the very first chapter that showed how to add a HANA output adapter to capture events in
HANA.
A couple of points to note:
1. The POWER_OUTAGES flex produces an output stream that contains only "insert" events - thus every power outage record will be added (appended) to the table in HANA.
2. The DASHBOARD flex will behave differently. It produces a window, and events are added to the window as "upserts" (update the row if this key value exists, otherwise insert the row). When this is attached to a HANA table, it will update the rows in the HANA table. Thus, the HANA table will always "mirror" the DASHBOARD window in the streaming project.
Summary (3)
You have completed the exercise!
You are now able to:
 Create custom operators using CCLScript
 Use HANA output adapters to update HANA tables so that the HANA table always has current values of
summary data. This is different from the more common use case where we are simply inserting events into
HANA tables to create a history of events
APPENDIX – CREATING A HANA USER AND GRANTING SMART DATA
STREAMING PERMISSIONS
Creating a New User
1. Navigate to the SAP HANA Administration Console perspective in HANA Studio
2. Expand the Security folder within your SYSTEM user
3. Right click Users and select “New User”
4. In the <sid> - New User window that opens, enter a User Name and a Password (and confirm the
password)
Note: The password chosen in this window will need to be changed the first time the user logs in.
5. Finally, we will grant the necessary Roles for the new streaming user to complete the Smart Data
Streaming – Freezer Monitoring Tutorial.
a. First, grant roles by selecting the Granted Roles tab and clicking the green plus button.
b. Select all roles appearing in the screenshot below. To select multiple roles, hold the control
button when selecting.
6. Press the Execute button in the top right to create the new user.
7. Alternatively, the Cluster Administrative Tool for SAP HANA Smart Data Streaming can be used to
administer permissions to users. When you install Smart Data Streaming as part of your HANA system,
the HANA SYSTEM user is assigned all of the required permissions to administer the Streaming servers
including deploying and running projects. While using the SYSTEM user is fine for getting started and
possibly even for your early streaming project development, as a best practice you will want to use a
different user for production systems. Permissions specific to administering the smart data streaming
servers are managed through the Cluster Administrative Tool ("streamingclusteradmin"). Detailed
documentation on the functionality of the Cluster Administrative Tool is covered in the Cluster
Administrative Tool section of the SAP HANA Smart Data Streaming: Configuration and Administration
Guide.
http://help.sap.com/saphelp_hana_options_sds_conf/helpdata/en/e7/9ba4ba6f0f101494e384af0fcbabb7/frameset.htm
The purpose of this section is to provide a few tips that may help you out if this is the first time you are
using the Cluster Administrative Tool.
a) You need to run the streamingclusteradmin tool as the <sid>adm Linux user. For example if your
HANA System ID ("SID") is "HA0" then you would need to log on to the Streaming node of your
HANA system as the "ha0adm" user.
b) Assuming that you use the default path of "/hana/..." for the shared storage location of your HANA
system, then the streamingclusteradmin utility is located in the
/hana/shared/<SID>/streaming/STREAMING-1_0/bin directory. If your <SID> is "HA0", then the path
would be: "/hana/shared/HA0/streaming/STREAMING-1_0/bin"
c) You should use the HANA SYSTEM user to connect to the streamingclusteradmin tool. For example:
./streamingclusteradmin --uri=esps://<fully.qualified.domain.name>:3XX26 --username=SYSTEM
--password=<SYSTEM user password>
d) When specifying the address for the URI parameter:
You need to provide the machine name of the Streaming node, not the HANA master node. For
example if the HANA master node is installed on machine "hanaserver.mydomain.com" and the
Streaming server is running on the node "streamingnode.mydomain.com", then the URI parameter
would be: "--uri=esps://streamingnode.mydomain.com:3XX26"
The port # will have the format 3XX26 where XX = the Instance # of your HANA system. If your HANA
system is Instance # 00, then the port # will be 30026. If the HANA Instance # is 11, then the port #
will be 31126.
e) The "grant perm <priv> [<privtype>] [on [any] <resourcetype> [<resource>]] to user|role <name>"
command grants the specified permission, optionally restricted to a specific type of element, a
specific type of resource, or a specific resource, to the specified user or role.
 <priv> is one of: read, view, write, add, remove, start, stop, control, execute, admin, or all
 <privtype> is one of: stream, adapter, project, application, dataservice, workspace, node, cluster, service, system, or all
 <resourcetype> is one of: stream, adapter, project, application, dataservice, workspace, node, cluster, service, system, or all
 <resource> is the name of a specific instance of the specified resource type
 <name> is the name of the user or role to which you are granting this permission
For example, to grant the user developer1 permission to perform all actions, but only in workspace
w1, enter:
--command "grant perm all on workspace w1 to user developer1"
To grant the role manager permission to read all streams in any workspace, enter:
--command "grant perm read stream on any workspace to role manager"
For more detailed information regarding streamingclusteradmin, consult this documentation.
© 2015 SAP SE or an SAP affiliate company. All rights reserved.
No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP SE or an SAP
affiliate company. SAP and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered
trademarks of SAP SE (or an SAP affiliate company) in Germany and other countries.
Please see http://www.sap.com/corporate-en/legal/copyright/index.epx#trademark for additional trademark information and notices.