LSEG Data Platform (Data Platform) is a cloud-enabled, open platform that brings together content, analytics, and proprietary, customer, and third-party data. It exposes that content through web-based APIs, so applications written in languages such as Python can authenticate, request data, and consume Data Platform services programmatically.
This article demonstrates the alerts workflow for retrieving news headlines or stories from Data Platform in Python. Rather than using a synchronous request-response call for each update, the alerts service lets an application create a subscription and receive matching events asynchronously.
Data Platform supports multiple delivery mechanisms depending on the content set, including Request/Response, Alerts or Messages-Services, Bulk, and Streaming. For news alerts, the platform uses the Alerts or Messages-Services model together with Amazon SQS, allowing subscribed applications to poll a queue for updates that match the subscription criteria.
The sections below walk through the end-to-end process: authenticate, create a news subscription, obtain temporary cloud credentials, poll and decrypt queue messages, and finally remove the subscription when processing is complete.
Prerequisites
Before running this notebook, ensure the following are in place:
- LSEG Data Platform account — A valid LSEG Data Platform account with credentials (username/password for V1 authentication, or client ID/secret for V2 client credentials flow) and access to the News Alerts service.
- Python environment — Python 3.7 or later is recommended. A virtual environment (e.g. `venv` or `conda`) is advised to isolate dependencies.
- Python dependencies — Install the required packages before executing any cells:
pip install -r requirements.txt
Key packages include `requests` (HTTP calls to Data Platform), `boto3` (Amazon SQS polling), and `pycryptodome` (AES-GCM message decryption).
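If a requirements.txt is not shipped alongside the notebook, a minimal one covering the packages named above would look like the following (unpinned here for illustration; pin versions in production):

```
requests
boto3
pycryptodome
```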
Import libraries and define global variables
This notebook follows the LSEG Data Platform alerts workflow for news delivery over the Alerts or Messages-Services mechanism.
In this model, the application authenticates to Data Platform, creates a news subscription, receives an Amazon SQS queue endpoint plus a cryptographyKey, requests temporary cloud credentials for that queue, and then polls the queue for encrypted messages.
The global variables in this notebook hold two kinds of state: alert subscription details such as endpoint, cryptographyKey, and subscriptionID, and temporary queue access credentials such as accessKeyId, secretKey, sessionToken, and cloudEndPoint. Install pycryptodome if AES-GCM decryption is not already available in your environment.
import requests
import json
import boto3
import base64
from Crypto.Cipher import AES #pip install pycryptodome
REGION = 'us-east-1'
#Alerts variables
endpoint = ""
cryptographyKey = ""
subscriptionID = ""
#cloud variables
accessKeyId = ""
secretKey = ""
sessionToken = ""
cloudEndPoint = ""
1. Get an access token
Authentication is the first step for all Data Platform API requests. The platform uses OAuth 2.0, and the bearer token returned from the authentication endpoint is then passed to the alerts subscription and cloud-credentials endpoints.
This notebook shows two common approaches: a username/password flow for V1 authentication and a client-credentials flow for V2 authentication. Only one valid access token is needed for the later steps.
1.1 V1 Authentication (Username, Password, and ApplicationKey)
This flow posts credentials to the OAuth v1 token endpoint with grant_type=password and scope=trapi. If the request is successful, the response contains an access_token that can be reused in the following REST calls.
This pattern is aligned with the login step described in the alerts articles, where user credentials and an application identifier are exchanged for a token that authorizes access to the News alerts services.
username = "<username>"
password = "<password>"
application_key = "<appkey>"
url = "https://api.refinitiv.com/auth/oauth2/v1/token"
payload = 'username='+username+'&password='+password+'&grant_type=password&scope=trapi&takeExclusiveSignOnControl=true&client_id='+application_key
headers = {
    'Content-Type': 'application/x-www-form-urlencoded'
}
response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
json_obj = json.loads(response.text)
if "access_token" in json_obj:
    access_token = json_obj["access_token"]
else:
    access_token = ""
    print("Error: No Access Token")
print(access_token)
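The V1 token response above also carries a refresh_token and an expires_in value, so a long-running application can renew the token without re-sending the password. A hedged sketch of building the refresh request body (the grant_type=refresh_token form is standard OAuth 2.0; confirm the exact field set against the Data Platform API reference):

```python
from urllib.parse import urlencode

def build_refresh_payload(username, refresh_token, application_key):
    """Build the x-www-form-urlencoded body for a v1 token refresh.

    Assumes the endpoint accepts the standard OAuth 2.0
    grant_type=refresh_token form with the same client_id used above.
    """
    return urlencode({
        "username": username,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
        "client_id": application_key,
    })

body = build_refresh_payload("<username>", "<refresh_token>", "<appkey>")
print(body)
```

POST this body to the same v1 token URL with the x-www-form-urlencoded header whenever the current token approaches its expires_in deadline.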
1.2 V2 Authentication (Client ID and Client Secret)
This alternative uses the OAuth v2 client credentials flow. It is useful when the application authenticates as a client rather than as an interactive user.
The result is still a bearer token used in exactly the same way as the v1 token: include it in the Authorization header when creating subscriptions and when requesting temporary queue credentials.
client_id = "<client_id>"
client_secret = "<client_secret>"
url = "https://api.refinitiv.com/auth/oauth2/v2/token"
payload = 'grant_type=client_credentials&client_id='+client_id+'&client_secret='+client_secret+'&scope=trapi'
headers = {
    'Content-Type': 'application/x-www-form-urlencoded'
}
response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
json_obj = json.loads(response.text)
if "access_token" in json_obj:
    access_token = json_obj["access_token"]
else:
    access_token = ""
    print("Error: No Access Token")
print(access_token)
2. News Subscription
News alerts use the asynchronous Alerts or Messages-Services delivery mechanism. Instead of returning a full result set immediately, the platform creates a cloud queue and pushes matching updates into that queue as they become available.
At subscription time, the application chooses the alert type, provides the transport definition, and can add filters such as language, freetext, or news codes. AWS-SQS is the transport used by the alerts examples and is currently the supported delivery target described in the articles.
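The filter is a plain JSON tree of operands, so it can be composed programmatically. A small illustrative helper (not part of the API) that builds the AND of the language and freetext filters used in the next section:

```python
import json

def and_filter(*operands):
    """Combine filter operands with a logical AND, per the alerts filter schema."""
    return {"type": "operator", "operator": "and", "operands": list(operands)}

def language(code):
    """Language filter, e.g. 'L:en' for English."""
    return {"type": "language", "value": code}

def freetext(value, flags=("headline",)):
    """Freetext 'contains' match against the given message fields."""
    return {"type": "freetext", "match": "contains", "value": value, "flags": list(flags)}

flt = and_filter(freetext("TOP NEWS"), language("L:en"))
print(json.dumps(flt, indent=2))
```

The resulting dict slots directly into the "filter" key of the subscription request shown below.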
2.1 News Headlines Subscription
A headlines subscription returns headline and metadata updates only. In the workflow described by LSEG, the subscription request asks the service to create an AWS SQS queue and start publishing messages that match the requested criteria.
The response typically contains three values needed later in the notebook: the queue endpoint, the cryptographyKey used to decrypt message bodies, and the subscriptionID used to terminate the subscription. This example also applies an English-language and TOP NEWS filter before requesting payloadVersion 2.0.
isHeadlinesSubscription = True
url = "https://api.refinitiv.com/alerts/v1/news-headlines-subscriptions"
# request = {
#     "transport": {
#         "transportType": "AWS-SQS"
#     }
# }
request = {
    "transport": {
        "transportType": "AWS-SQS"
    },
    "filter": {
        "type": "operator",
        "operator": "and",
        "operands": [
            {
                "type": "freetext",
                "match": "contains",
                "value": "TOP NEWS",
                "flags": [
                    "headline"
                ]
            },
            {
                "type": "language",
                "value": "L:en"
            }
        ]
    },
    "payloadVersion": "2.0"
}
payload = json.dumps(request)
headers = {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer '+access_token
}
response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
json_obj = json.loads(response.text)
endpoint = json_obj["transportInfo"]["endpoint"]
cryptographyKey = json_obj["transportInfo"]["cryptographyKey"]
subscriptionID = json_obj["subscriptionID"]
print("\nEndpoint: "+endpoint)
print("cryptographyKey: "+cryptographyKey)
print("subscriptionID: "+subscriptionID)
2.2 News Stories Subscription
A stories subscription requests story-oriented updates rather than headline-only alerts. Depending on message size, the decrypted payload may contain the story content directly in payload, or it may provide an href claim check that the application can follow to retrieve larger content from cloud storage.
As with headlines, the response returns the queue endpoint, the cryptography key, and the subscription identifier. The filter in this notebook shows how the request can be narrowed to a specific news code.
isHeadlinesSubscription = False
url = "https://api.refinitiv.com/alerts/v1/news-stories-subscriptions"
# request = {
#     "transport": {
#         "transportType": "AWS-SQS"
#     }
# }
# request = {
#     "transport": {
#         "transportType": "AWS-SQS"
#     },
#     "filter": {
#         "type": "operator",
#         "operator": "and",
#         "operands": [
#             {
#                 "type": "freetext",
#                 "match": "contains",
#                 "value": "TOP NEWS",
#                 "flags": [
#                     "headline"
#                 ]
#             },
#             {
#                 "type": "language",
#                 "value": "L:en"
#             }
#         ]
#     },
#     "payloadVersion": "2.0"
# }
request = {
    "transport": {
        "transportType": "AWS-SQS"
    },
    "filter": {
        "value": "R:EUR=",
        "type": "newscode"
    },
    "payloadVersion": "2.0"
}
payload = json.dumps(request)
headers = {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer '+access_token
}
response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
json_obj = json.loads(response.text)
endpoint = json_obj["transportInfo"]["endpoint"]
cryptographyKey = json_obj["transportInfo"]["cryptographyKey"]
subscriptionID = json_obj["subscriptionID"]
print("\nEndpoint: "+endpoint)
print("cryptographyKey: "+cryptographyKey)
print("subscriptionID: "+subscriptionID)
3. Get Cloud Credentials
The queue is owned by the platform, so the application must request temporary cloud credentials before it can poll Amazon SQS. This is done by calling the cloud-credentials endpoint and passing the queue endpoint returned from the subscription step.
The response contains accessKeyId, secretKey, sessionToken, and the queue endpoint. These credentials are temporary and, as described in the delivery-mechanism article, expire every hour, so long-running applications should refresh them periodically.
url = "https://api.refinitiv.com/auth/cloud-credentials/v1/?endpoint="+endpoint
payload = {}
headers = {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer '+access_token
}
response = requests.request("GET", url, headers=headers, data=payload)
print(response.text)
json_obj = json.loads(response.text)
accessKeyId = json_obj["credentials"]["accessKeyId"]
secretKey = json_obj["credentials"]["secretKey"]
sessionToken = json_obj["credentials"]["sessionToken"]
cloudEndPoint = json_obj["endpoint"]
print("\naccessKeyId: "+accessKeyId)
print("secretKey: "+secretKey)
print("sessionToken: "+sessionToken)
print("cloudEndPoint: "+cloudEndPoint)
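Because the credentials expire hourly, a long-running poller needs a refresh guard. A minimal sketch (the 55-minute margin is an arbitrary safety buffer, not a documented value):

```python
import time

CREDENTIAL_TTL_SECONDS = 55 * 60  # refresh a little before the 1-hour expiry

class CredentialClock:
    """Tracks when the temporary queue credentials were last fetched."""

    def __init__(self, now=time.time):
        self._now = now          # injectable clock, handy for testing
        self.fetched_at = None   # None until the first fetch

    def mark_refreshed(self):
        """Record that fresh credentials were just obtained."""
        self.fetched_at = self._now()

    def needs_refresh(self):
        """True when no credentials exist yet or the TTL margin has elapsed."""
        return (self.fetched_at is None
                or (self._now() - self.fetched_at) >= CREDENTIAL_TTL_SECONDS)
```

In the polling loop below, one would check needs_refresh() before each receive and, when it returns True, repeat the GET to the cloud-credentials endpoint and rebuild the boto3 session with the new keys.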
4. Retrieve and decrypt messages
After the queue credentials are available, the application polls Amazon SQS, receives any waiting messages, decrypts each message body, and deletes processed messages from the queue to avoid duplicates. Any message that is not retrieved from the queue expires automatically after 14 days and cannot be recovered.
Alert messages are base64-encoded and encrypted with AES-256 in GCM mode. The decryption process uses the subscription cryptographyKey, extracts the AAD, NONCE, and TAG from the encoded message, and then produces a JSON payload. That decrypted JSON may contain either an inline payload or an href to large content, along with timestamps and subscription metadata.
#==============================================
def decrypt(key, source):
#==============================================
    GCM_AAD_LENGTH = 16
    GCM_TAG_LENGTH = 16
    GCM_NONCE_LENGTH = 12
    key = base64.b64decode(key)
    cipherText = base64.b64decode(source)
    aad = cipherText[:GCM_AAD_LENGTH]
    nonce = aad[-GCM_NONCE_LENGTH:]
    tag = cipherText[-GCM_TAG_LENGTH:]
    encMessage = cipherText[GCM_AAD_LENGTH:-GCM_TAG_LENGTH]
    cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
    cipher.update(aad)
    decMessage = cipher.decrypt_and_verify(encMessage, tag)
    return decMessage
This code creates an AWS SQS client with the temporary cloud credentials, polls the queue for up to max_message_count messages, decrypts each message body with the cryptographyKey, prints the decoded JSON payload, and then deletes the processed message from the queue to avoid duplicate processing.
message_count = 0
max_message_count = 20
session = boto3.Session(
    aws_access_key_id = accessKeyId,
    aws_secret_access_key = secretKey,
    aws_session_token = sessionToken,
    region_name = REGION
)
sqs = session.client('sqs', verify=True)
print('Polling messages from queue...')
while 1:
    if message_count >= max_message_count:
        print("### Max count reached ###")
        break
    resp = sqs.receive_message(QueueUrl = endpoint, WaitTimeSeconds = 10)
    if 'Messages' in resp:
        messages = resp['Messages']
    else:
        print("No message")
        messages = []
    # print and remove all the nested messages
    for message in messages:
        message_count = message_count + 1
        mBody = message['Body']
        # decrypt this message
        m = decrypt(cryptographyKey, mBody)
        pl = json.loads(m)
        print(json.dumps(pl, indent=2))
        sqs.delete_message(QueueUrl = endpoint, ReceiptHandle = message['ReceiptHandle'])
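As noted in section 2.2, a decrypted story message may carry its content inline or only a claim-check link. One way to branch on this, assuming the decrypted JSON exposes either a payload field or an href field (field names follow the article's description; verify them against real messages):

```python
def classify_message(decrypted):
    """Return ('inline', content) for inline payloads, ('href', url) for claim checks."""
    if "payload" in decrypted:
        return ("inline", decrypted["payload"])
    if "href" in decrypted:
        # Large stories: follow the href (e.g. with requests and the bearer token)
        return ("href", decrypted["href"])
    return ("unknown", decrypted)

kind, content = classify_message({"payload": {"newsMessage": {}}})
print(kind)
```

Inside the polling loop, this dispatch would run on each pl before printing, routing href messages to a separate download step instead of treating them as complete stories.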
5. Delete a Subscription
When message consumption is complete, the application should delete the subscription by calling the corresponding alerts endpoint with the subscriptionID. This stops new messages from being delivered for that subscription and matches the final cleanup step described in the alerts workflow.
Use the headlines delete endpoint for headline subscriptions and the stories delete endpoint for story subscriptions.
if isHeadlinesSubscription:
    url = "https://api.refinitiv.com/alerts/v1/news-headlines-subscriptions?subscriptionID="+subscriptionID
else:
    url = "https://api.refinitiv.com/alerts/v1/news-stories-subscriptions?subscriptionID="+subscriptionID
payload = {}
headers = {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer '+access_token
}
response = requests.request("DELETE", url, headers=headers, data=payload)
print(response)
print(response.text)
Summary
This notebook demonstrates the complete workflow for consuming news alerts from LSEG Data Platform using Python. The process consists of six key steps:
- Authentication: Obtain an OAuth 2.0 access token using either the v1 password-based flow or the v2 client credentials flow.
- News Subscription: Create a headline or story subscription by specifying the transport type (AWS-SQS), optional filters (language, freetext, news codes), and payload version. The platform responds with a queue endpoint, cryptography key, and subscription ID.
- Cloud Credentials: Request temporary cloud credentials to access the AWS SQS queue. These credentials expire hourly and must be periodically refreshed for long-running applications.
- Polling and Message Retrieval: Use boto3 to connect to the SQS queue with the provided credentials and poll for messages. Delete messages after processing to avoid duplicates. Unprocessed messages automatically expire after 14 days.
- Message Decryption: Decrypt the base64-encoded message body using AES-256 with GCM mode. The decrypted JSON payload contains either inline news content or an href claim check for large payloads stored in cloud storage.
- Subscription Cleanup: Delete the subscription by passing the subscription ID to the appropriate alerts endpoint, stopping further message delivery.
The alerts delivery mechanism provides an asynchronous, event-driven pattern for accessing news and research updates from Data Platform, enabling real-time applications, news blotters, and monitoring dashboards to stay current with breaking content.
References
The notebook content and workflow are based on the following LSEG Developer articles and API resources:
- Retrieving News Headlines or Stories from Refinitiv Data Platform Alerts with Java
- Message services delivery mechanism in RDP
- LSEG Data Platform APIs
These references describe the authentication flow, alerts subscription model, AWS SQS delivery pattern, cloud credentials retrieval, message decryption process, and subscription cleanup used throughout this notebook.