Last updated news for get_news

Hi all,

I want to get the last 20 minutes of news for a stock. How up to date is the news returned by the get_news method?
Is it real time?

Thanks

@huseyin Take a look at the news endpoint. You can get news articles between any two datetimes, and that includes the most current news. If an end is not specified, it defaults to the current time. One thing to note is that a maximum of 50 articles is returned per request, so you should check the page token and request more pages if it is not empty. That may not be an issue for only the last 20 minutes, but it is something to be aware of. I can provide a code sample in Python if needed.
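For reference, here is a rough sketch of what that looks like against the endpoint directly. The v1beta1 path, the APCA-API-KEY-ID / APCA-API-SECRET-KEY headers, and the news / next_page_token response fields are my reading of the docs, so double-check them before relying on this.

import requests
import pandas as pd

API_KEY = 'xxxxx'
API_SECRET_KEY = 'xxxxx'

end = pd.Timestamp.now(tz="America/New_York")
start = end - pd.Timedelta(minutes=20)

resp = requests.get(
    "https://data.alpaca.markets/v1beta1/news",
    headers={"APCA-API-KEY-ID": API_KEY, "APCA-API-SECRET-KEY": API_SECRET_KEY},
    params={
        "symbols": "TSLA",
        "start": start.isoformat(),
        "end": end.isoformat(),  # optional, defaults to the current time
        "limit": 50,             # maximum articles per page
    },
)
resp.raise_for_status()
payload = resp.json()
# if next_page_token is not empty, there are more pages to fetch
print(len(payload["news"]), payload.get("next_page_token"))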

Thank you so much.
But the documentation you shared is not clear.
I am using the following script:
# assumes rest_client is an alpaca_trade_api REST instance created with your keys
from datetime import datetime, timedelta
import pandas as pd

now = datetime.now()
twenty_mins_ago = now - timedelta(minutes=20)
# note: tz_localize assumes the machine's local clock is already New York time
news_start_time = pd.to_datetime(twenty_mins_ago).tz_localize('America/New_York').isoformat()
news_end_time = pd.to_datetime(now).tz_localize('America/New_York').isoformat()
news_last_20_mins = rest_client.get_news('TSLA', news_start_time, news_end_time)

If there is nothing wrong with it, I will stick to my script.
Please let me know.
Thanks

I personally do not like using the datetime methods and prefer only using the pandas methods. So, here is my implementation.

!pip install -q alpaca-py
import pandas as pd

from alpaca.data.historical.news import NewsClient
from alpaca.data.requests import NewsRequest

API_KEY = 'xxxxx'
API_SECRET_KEY = 'xxxxx'
news_client = NewsClient(api_key=API_KEY, secret_key=API_SECRET_KEY)

# build a 20 minute window ending now, in US/Eastern
current_time = pd.Timestamp.now(tz="America/New_York")
time_20_minutes_previous = current_time - pd.Timedelta(minutes=20)

# fetch TSLA news for that window (50 articles max per request)
news_last_20_mins = news_client.get_news(
    NewsRequest(
        symbols='TSLA',
        start=time_20_minutes_previous,
        end=current_time,
        limit=50,
    )
)

Note this will only return the first 50 articles. This probably isn’t an issue if only fetching news for 20 minutes for a single symbol, but if more symbols or a longer timeframe is requested, one should probably implement some sort of loop to check for multiple pages and fetch those as necessary.
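If it helps, here is a rough sketch of such a loop. I am not sure how (or whether) the alpaca-py client exposes the page token, so this sketch goes through the raw news endpoint with requests; the v1beta1 path, header names, and next_page_token / page_token fields are assumptions to verify against the News API docs.

import requests

def fetch_all_news(symbols, start, end, api_key, secret_key):
    # follow next_page_token until it comes back empty, collecting every page
    url = "https://data.alpaca.markets/v1beta1/news"
    headers = {"APCA-API-KEY-ID": api_key, "APCA-API-SECRET-KEY": secret_key}
    params = {"symbols": symbols, "start": start, "end": end, "limit": 50}
    articles = []
    while True:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        payload = resp.json()
        articles.extend(payload.get("news", []))
        token = payload.get("next_page_token")
        if not token:
            break
        params["page_token"] = token  # ask for the next page
    return articles

# reusing the timestamps and keys from the snippet above (hypothetical usage)
all_articles = fetch_all_news(
    "TSLA,AAPL",
    time_20_minutes_previous.isoformat(),
    current_time.isoformat(),
    API_KEY,
    API_SECRET_KEY,
)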

Thank you so much.
I appreciate it.