Azure Event Hub Diagnostic Logs

Pull diagnostic logs from Azure Event Hub to Logstash

Follow the steps below to send your observability data to Logit.io


Install Integration

Please click on the Install Integration button to configure your stack for this source.

Prerequisites

Before you begin, you will need to ensure you have an available Azure Event Hub in your Azure Portal.

Confirm you have the following:

  • An Event Hub you wish to collect diagnostic logs from.
  • A separate Event Hub to stream those diagnostic logs to.

The Azure Event Hub Logstash plugin is only available for stacks running Logstash 6.4 onwards.

Configure Diagnostic Logs

Once you have confirmed you have everything required, it's time to configure diagnostic logging.

In the Event Hub namespace you wish to collect diagnostic logs from, browse to Diagnostic settings in the left-hand menu.


Choose Add diagnostic setting, enter a suitable name, and then select the log categories you need.

Select Stream to an event hub as the destination and enter the details of the second Event Hub.

Once you're happy with the settings, select Save.


Configure Permissions

Once you have data streaming to your Azure Event Hub, we recommend creating a consumer group specifically for Logstash rather than reusing the default or any existing groups.

The Logstash input supports multiple Event Hubs. The connection string for each hub can be found in the Azure Portal under Event Hub -> Shared access policies.

Example Event Hub connection string:
Endpoint=sb://<youreventhubnamespace>.servicebus.windows.net/;SharedAccessKeyName=<yoursharedaccesspolicyname>;SharedAccessKey=<yoursharedaccesskey>;EntityPath=<youreventhubname>
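
For illustration only, a minimal sketch of an azure_event_hubs input using this connection string and a dedicated consumer group could look like the following (the consumer group name "logstash" is a placeholder, not a required value):

input {
  azure_event_hubs {
    # Placeholder connection string for the Event Hub the diagnostic logs are streamed to
    event_hub_connections => ["Endpoint=sb://<youreventhubnamespace>.servicebus.windows.net/;SharedAccessKeyName=<yoursharedaccesspolicyname>;SharedAccessKey=<yoursharedaccesskey>;EntityPath=<youreventhubname>"]
    # Dedicated consumer group created for Logstash
    consumer_group => "logstash"
  }
}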

A Blob storage account is used to preserve state across Logstash restarts. The storage account connection string can be found in the Access keys section of the storage account's settings menu in the Azure Portal.

Example storage account connection string:
DefaultEndpointsProtocol=https;AccountName=<storage-account-name>;AccountKey=<storage-account-key>;EndpointSuffix=core.windows.net
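
Putting both connection strings together, the input sketch shown earlier can be extended with the storage_connection option so that processing state survives a restart (placeholder values throughout):

input {
  azure_event_hubs {
    event_hub_connections => ["Endpoint=sb://<youreventhubnamespace>.servicebus.windows.net/;SharedAccessKeyName=<yoursharedaccesspolicyname>;SharedAccessKey=<yoursharedaccesskey>;EntityPath=<youreventhubname>"]
    consumer_group => "logstash"
    # Blob storage connection string used to persist state across Logstash restarts
    storage_connection => "DefaultEndpointsProtocol=https;AccountName=<storage-account-name>;AccountKey=<storage-account-key>;EndpointSuffix=core.windows.net"
  }
}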

Start Sending Logs To A Stack

To start pulling logs and metrics from the Azure Event Hub to your Stack, you need to configure an Azure Logstash Input on your Logit.io Stack.


Logit.io will verify your input before it is applied; we will contact you to confirm when this has been completed.

Check Logit.io for your logs

Data should now have been sent to your Stack.


If you don't see your data, take a look at How to diagnose no data in Stack below for help diagnosing common issues.

How to diagnose no data in Stack

If you don't see data appearing in your stack after following this integration, take a look at the troubleshooting guide for steps to diagnose and resolve the problem or contact our support team and we'll be happy to assist.

Winlogbeat Installation Instructions Overview

Active Directory is a Microsoft technology that provides a centralized database for storing information about users, computers, and other resources on a network. It is widely used in enterprise environments to manage user accounts, permissions, and access to resources.

To effectively monitor and analyze security events in an Active Directory environment, it is essential to have a reliable and efficient log management solution. One way to achieve this is by using winlogbeat, an open-source log shipper that can be installed on a Windows server to collect and ship event logs to Logstash.

Winlogbeat is designed to work with the Elastic Stack, which includes Elasticsearch, Logstash, and Kibana. By sending event logs from Active Directory to Logstash, organizations can centralize their log data and analyze it in real time for threat detection and response.

In summary, using winlogbeat to ship logs from a Windows server to Logstash can provide organizations with a reliable and efficient way to monitor and analyze security events in their Active Directory environment. By centralizing log data and analyzing it in real-time, organizations can detect and respond to security threats quickly and effectively.

If you need any further assistance with migrating your Azure data to Logstash we're here to help you discover the insights locked in data hosted on Azure, GCP, AWS or any of the integrations covered in our data sources. Feel free to reach out by contacting our support team via live chat and we'll be happy to assist.