Azure Sentinel Data Connector for Stream


Microsoft Azure provides several different mechanisms to send data to Sentinel.  Vectra AI leverages the Syslog data connector to send Vectra Stream metadata to Sentinel.  This solution requires the Microsoft syslog agent to be installed on a separate Linux instance (cloud or on-premises).  This instance runs a syslog server (rsyslog) to receive the network metadata from Vectra Cognito Stream and forwards it locally to the Operations Management Suite (OMS) Agent.  Once received by the agent, the data is forwarded to the configured Log Analytics workspace within Azure.

In Sentinel, the data coming from this connector is stored in the Syslog table.  This table is minimalistic and contains only 8 properties.  Among them, SyslogMessage is the property that contains the entire syslog message in raw format.  To ease the work of building queries and searching through the different network metadata, Vectra AI provides a set of Kusto functions (parsers) to parse the different attributes available for each metadata type.  These functions can be found in the Azure Sentinel GitHub repository and need to be installed once the connector is set up.  Kusto Query Language (KQL) has built-in functions to parse JSON data very efficiently.  With that in mind, we want the Stream metadata to be sent in JSON format.
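Even before the parser functions are installed, the JSON payload can be parsed ad hoc with parse_json().  The sketch below illustrates the idea; the ProcessName value 'stream' matches the tag set in the rsyslog template later in this article, but the metadata_type field name is an assumption about the Stream payload, not something documented here:

```kusto
Syslog
| where ProcessName == 'stream'
| extend data = parse_json(SyslogMessage)
// 'metadata_type' is a hypothetical field name, used for illustration only
| where tostring(data.metadata_type) == 'metadata_isession'
| project TimeGenerated, data
```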

To get into Sentinel, the data must be sent in syslog format to the OMS agent.  Currently, Vectra Cognito Stream sends the JSON metadata in raw format over TCP (the payload only, with no syslog header).  The Microsoft OMS agent does not accept this format and discards the data.  To work around this, we are going to change the rsyslog configuration to add a syslog header to the JSON data it receives before forwarding it to the OMS agent.
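As a plain illustration of the transformation rsyslog performs, the snippet below wraps a raw JSON payload in a minimal syslog header of the same shape the template in this article produces.  The sample payload is made up, and the timestamp and hostname are whatever the local machine reports:

```shell
# Illustration only: prepend "<13>TIMESTAMP HOSTNAME stream:" to a raw JSON payload.
# The payload value is a made-up example, not real Stream output.
payload='{"metadata_type":"metadata_isession","resp_pkts":42}'
printf '<13>%s %s stream:%s\n' "$(date '+%b %e %H:%M:%S')" "$(hostname)" "$payload"
```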

!! We strongly recommend that the Linux instance be deployed on-premises,
as the data between Vectra Cognito Stream and the Microsoft agent is not encrypted.



Installation and Configuration

OMS Agent Installation

The OMS agent can be deployed with either syslog-ng or rsyslog.  The instructions provided by Vectra for this solution are designed for rsyslog.  Before installing the agent, make sure that only rsyslog is installed.  To check which package is currently installed on the Linux instance, run:

# dpkg -l | grep syslog
ii      rsyslog      8.2001.0-1ubuntu1.1      amd64      reliable system and kernel logging daemon

The output above shows only rsyslog which is what we want (this is the default with Ubuntu for example).  If you also see syslog-ng, please remove the syslog-ng package first and make sure rsyslog is installed.  During the installation of the OMS agent, it will check automatically which syslog software is installed and configure it accordingly.

Once rsyslog is the only syslog daemon on the system, install the OMS agent.  Please follow the installation steps described in the Vectra AI Stream Connector available in Azure Sentinel (link to be added when available).

Configuration of rsyslog

As root, create a new configuration file for rsyslog:

vim /etc/rsyslog.d/20-vectra-stream.conf

Paste this configuration:

# provides TCP syslog reception
module(load="imtcp")

template(name="stream_custom" type="string" string="<13>%TIMESTAMP% %HOSTNAME% stream:%rawmsg%\n")

##### Send to Azure Sentinel through the OMS agent, which listens locally on UDP 25224
ruleset(name="stream") {
    action(type="omfwd" template="stream_custom" target="127.0.0.1" port="25224" protocol="udp")
}

input(type="imtcp" port="9009" ruleset="stream")
!! The listening port in this configuration file is set to 9009.  You can change it, but
it must match what you configure in the Vectra UI for Stream configuration (Settings > Cognito Stream).

Restart the rsyslog service:

systemctl restart rsyslog

Check the status:

# systemctl status rsyslog
● rsyslog.service - System Logging Service
     Loaded: loaded (/lib/systemd/system/rsyslog.service; enabled; vendor preset: enabled)
     Active: active (running) since Fri 2021-05-14 23:41:48 UTC; 3s ago
TriggeredBy: ● syslog.socket
       Docs: man:rsyslogd(8)
   Main PID: 1587580 (rsyslogd)
      Tasks: 10 (limit: 9449)
     Memory: 1.6M
     CGroup: /system.slice/rsyslog.service
             └─1587580 /usr/sbin/rsyslogd -n -iNONE

If the service is not running and you see error messages, reach out to Vectra AI support for assistance.

You can also validate that the machine is now listening on port 9009:

# netstat -antp | grep 9009
tcp        0      0*               LISTEN      1587580/rsyslogd

Validate the Setup

Once this is done, we can validate that the network metadata is being received in Sentinel.  You can use this query:

Syslog
| where ProcessName == 'stream'
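Beyond spot-checking individual rows, a quick volume query can confirm that metadata is flowing steadily.  This sketch uses only fields from the Syslog table described later in this article:

```kusto
Syslog
| where ProcessName == 'stream'
| summarize Events = count() by bin(TimeGenerated, 1h)
| order by TimeGenerated desc
```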

View for Azure Sentinel Console (Logs menu):


Query and Manipulate Network Metadata

Syslog Table

As described in the introduction, data from Vectra Stream is stored in the Syslog table which contains only these properties: 

Property        Description
Computer        Computer that the event was collected from.
Facility        Defines the part of the system that generated the message.
HostIP          IP address of the system sending the message.
HostName        Name of the system sending the message.
SeverityLevel   Severity level of the event.
SyslogMessage   Text of the message.
ProcessID       ID of the process that generated the message.
EventTime       Date and time that the event was generated.


The payload of the syslog message containing all the metadata is stored in the SyslogMessage attribute.


Example with iSession metadata from the above screenshot:


As you can see, nothing is parsed by default, but luckily the data is in a structured format (JSON).  To query the data more easily, we are going to add some custom parsers (KQL functions).

Kusto Functions

Kusto functions, once installed in your Azure Sentinel environment, parse all the attributes for each metadata type provided by Vectra AI.  There is one function per metadata type, and they are available in the Microsoft Sentinel GitHub repository (LINK TO BE ADDED ONCE AVAILABLE).  To install these functions, simply do the following for each one of them:

  • Open a new query
  • Copy the content of a function from the GitHub repository and paste it in the New Query window
  • Click Save > Save as function
  • Use a name which is easy to remember and type
  • Click Save
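For orientation, a parser function body typically follows a shape like the sketch below.  This is illustrative only: the real function bodies live in the GitHub repository, and the JSON field names used here (metadata_type, id.orig_h, id.resp_h) are assumptions, not taken from this article:

```kusto
// Illustrative sketch of a parser function body; real bodies come from the repo.
Syslog
| where ProcessName == 'stream'
| extend data = parse_json(SyslogMessage)
| where tostring(data.metadata_type) == 'metadata_ssh'   // hypothetical field name
| project TimeGenerated,
          SrcIp = tostring(data.id.orig_h),   // hypothetical field
          DstIp = tostring(data.id.resp_h)    // hypothetical field
```

Saved under an easy-to-type name of your choice, the function can then be invoked like a table in any query.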

Example with SSH function:


Once this is done, open a new query tab and select Functions > Workspace functions.  You should see your newly saved function:


When you click the function name, you can choose to use it in the editor (or simply type the function name manually in the query window).


Build your query:


Once your query is done, simply run it:

