Adding Supported Data Models
To publish data from the data streams, you must create a YAML file for each data stream you want to publish. The supported data models, as well as the format for the YAML files, are covered in this section.
Supported Data Models
Currently, Publisher supports three data streams, specified in a subscription as dataModels:
- Availability. Publisher stream for availability data from SL1.
- Interface. Publisher stream for interface performance data from SL1.
- Dynamic Application Performance. Publisher stream for Dynamic Application performance data from SL1.
Required Fields for Data Model File
The following fields must be specified in your Data Model YAML file:
- name. Specifies the unique name of the data model (for example, "availability"). Subscriptions will specify the data model to listen for, by name.
- address. Specifies the Kafka topic from which to extract data.
- config. This section lets you define Kafka connection options for the source and the sink.
- source. Specifies the configuration for the Kafka topic being consumed. This is the Kafka Broker containing the topic defined by the address field.
- sink. Specifies the configuration for the Kafka topic to which you will publish the data. This is the internal Kafka Broker containing the topics for subscriptions created by Publisher.
Template for Data Model File
To add a Data Model, create one in YAML using the DataModel CRD. This is a template for creating the file:
apiVersion: publisher.sl1.io/v1alpha1
kind: DataModel
metadata:
  name: <data model name>
spec:
  address: <kafka "url" to receive schema registry models from. ex: kafka://broker-address:port/topic_name>
  config:
    sink: # Any needed Kafka Client config for publishing messages
      <kafka_config_variable>: <config value>
    source: # Any needed Kafka Client config for consuming messages
      <kafka_config_variable>: <config value>
Example
The following example shows a YAML file for publishing availability data:
apiVersion: publisher.sl1.io/v1alpha1
kind: DataModel
metadata:
  name: availability
spec:
  address: kafka://kafka.kafka.svc.cluster.local:9092/avail.data
This example shows a YAML file for publishing interface data and adding a configuration when reading from the source topic:
apiVersion: publisher.sl1.io/v1alpha1
kind: DataModel
metadata:
  name: interface
spec:
  address: kafka://kafka.kafka.svc.cluster.local:9092/interface.data
  config:
    source:
      retry_backoff_ms: 100
This example shows a YAML file for publishing Dynamic Application performance data:
apiVersion: publisher.sl1.io/v1alpha1
kind: DataModel
metadata:
  name: dynamic-app
spec:
  address: kafka://kafka.kafka.svc.cluster.local:9092/da.prod.perf.pres.data
Applying the Data Model File
After creating the Data Model YAML file, you must apply it.
To apply the Data Model file:
- Either go to the console of the Management Node or use SSH to access the Management Node. Open a shell session on the server. Log in with the system password.
- At the shell prompt, enter the following command, substituting the name of your Data Model file:
kubectl apply -f <data_model_file>
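To confirm that the Data Model resource was created, you can query it with kubectl. The commands below are a minimal sketch and assume the DataModel CRD uses the plural resource name datamodels in the publisher.sl1.io group; if your cluster reports a different name, kubectl api-resources lists the correct one.
# List all Data Model resources (assumes the plural name "datamodels")
kubectl get datamodels.publisher.sl1.io
# Show the full spec of a single Data Model, for example "availability"
kubectl describe datamodels.publisher.sl1.io availability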
Adding a Subscription
To subscribe to data from the data streams, you must create a Subscription YAML file. You can subscribe to any of the supported data models you have created (see Adding Supported Data Models). The format for the YAML file is covered in this section.
Required Fields for Subscription File
The following fields must be specified in your Subscription YAML file:
- dataModels. Contains a list of data models from which to retrieve data, by name.
- address. Defines the Kafka Broker and Topic to which to publish the data.
- config. This section lets you define Kafka connection options for the source and the sink.
- source. Specifies the Kafka Topic being consumed. This is the internal Kafka Broker containing the topics for subscriptions created by Publisher.
- sink. Specifies the Kafka Topic to which you will publish the data. This is the external Kafka Broker defined by the address field.
Template for Subscription File
To add a Subscription, create one in YAML using the Subscription CRD. This is a template for creating the file:
apiVersion: publisher.sl1.io/v1alpha1
kind: Subscription
metadata:
  name: <subscription name>
spec:
  address: <destination; possible values: kafka://broker_address/topic_name>
  dataModels:
    - <datamodels to listen for>
    - <possible values are the names of the dataModels>
  config:
    sink: # Any needed Kafka Client config for publishing messages
      <kafka_config_variable>: <config value>
    source: # Any needed Kafka Client config for consuming messages
      <kafka_config_variable>: <config value>
Example
This example shows a YAML file for subscribing to availability and interface data:
apiVersion: publisher.sl1.io/v1alpha1
kind: Subscription
metadata:
  name: data-lake
spec:
  address: kafka://192.10.14.37:9092/sl1_sub_topic
  dataModels:
    - availability
    - interface
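After you apply this subscription (see Applying the Subscription File below) and data begins to flow, one way to spot-check the destination topic is with the standard Kafka console consumer on the destination broker. This is a sketch only; it assumes the Kafka command-line tools are available on that host and reuses the broker address and topic from the example above.
# Read a few messages from the subscription's destination topic (example broker and topic)
kafka-console-consumer.sh --bootstrap-server 192.10.14.37:9092 --topic sl1_sub_topic --from-beginning --max-messages 5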
Applying the Subscription File
After creating the Subscription YAML file, you must apply it.
To apply the Subscription file:
- Either go to the console of the Management Node or use SSH to access the Management Node. Open a shell session on the server. Log in with the system password.
- At the shell prompt, enter the following command, substituting the name of your Subscription file:
kubectl apply -f <subscription_file>
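To confirm that the Subscription resource was created, you can query it with kubectl. As with Data Models, this sketch assumes the Subscription CRD uses the plural resource name subscriptions in the publisher.sl1.io group; the fully qualified form avoids clashes with any other CRD that uses the same plural.
# List all Subscription resources (assumes the plural name "subscriptions")
kubectl get subscriptions.publisher.sl1.io
# Show the full spec of the example subscription named "data-lake"
kubectl describe subscriptions.publisher.sl1.io data-lake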
Configuring an Authenticated Connection for Subscriptions
This section describes how to configure authentication for your subscriptions. Currently, only the Secure Sockets Layer (SSL) authentication method is supported.
Certificate Files
To configure Publisher for SSL communication with Kafka, you need at least three certificate files:
- cafile. A Certificate Authority (CA) file used in certificate verification. This corresponds to the ssl_cafile parameter in Kafka.
- certfile. A client certificate file (.pem format) and any files needed to verify the certificate's authenticity. This corresponds to the ssl_certfile parameter in Kafka.
- keyfile. Client private key. This corresponds to the ssl_keyfile parameter in Kafka.
Publisher ingests the certificate files as Base64-encoded file contents. Before moving to the next step, use a command such as the following to encode and output the contents of each file; you will copy and paste this output in the next procedure.
cat <file_path>/<file_name> | base64 -w 0
Publisher does not support newline characters in the encoded certificate contents. The -w 0 option in the command above encodes the file without line wrapping, producing one continuous string.
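If the three files are in the same directory, you can encode them in one pass. The file names below (CARoot.pem, client-cert.pem, and client-key.pem) are examples only; substitute the names of your own certificate files.
# Print a single-line Base64 string for each certificate file (example file names)
for f in CARoot.pem client-cert.pem client-key.pem; do
  echo "$f:"
  base64 -w 0 "$f"
  echo
done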
Subscription Example
The following is an example of a subscription configured to connect to a Kafka server through an SSL connection:
apiVersion: publisher.sl1.io/v1alpha1
kind: Subscription
metadata:
  name: <subscription_name>
spec:
  address: kafka://<kafka_address>:<kafka_port>/<dest_topic>
  dataModels:
    - <desired datamodels>
  config:
    sink:
      security_protocol: "SSL"
      ssl_check_hostname: false
      ssl_cafile: "LS0t…tLS0K"
      ssl_certfile: "LS0t…Qo="
      ssl_keyfile: "Qm…LS0K"
The fields ssl_cafile, ssl_certfile, and ssl_keyfile are Base64-encoded strings of the respective files.
For example, if a file called CARoot.pem stores your Certificate Authority root certificate, you can convert it into a Base64-encoded string (such as the LS0t…tLS0K value shown in the example subscription above) and use that string in the ssl_cafile field of the sink configuration for your subscription.
The ssl_check_hostname parameter might need to be set to true, depending on how the destination Kafka server and its respective certificates are configured.
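Before encoding the certificate files, it can be useful to confirm that the client certificate actually chains to the CA you plan to supply in ssl_cafile and that it has not expired; an invalid certificate will also prevent the SSL connection from being established. The file names below are examples only.
# Verify the client certificate against the CA file (example file names)
openssl verify -CAfile CARoot.pem client-cert.pem
# Inspect the certificate's subject and validity window
openssl x509 -in client-cert.pem -noout -subject -dates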