Sign Up via LinkedIn Newsletters
-----------------------------------------
A weekly newsletter on technology applications in investment management, with an AI / LLM and automation angle. Curated news, announcements, and posts, drawn primarily straight from the sources. We apply some of ROI's Kubro(TM) Engine tools on the backend for production, with a human still in the loop (for now). It's a start, and it will evolve weekly.
Disclaimers: content is not fully human-verified; the summaries below are AI-generated. AI/LLMs may hallucinate and produce inaccurate summaries. These are select items only, not intended as a comprehensive view, and are for information purposes only. Please DM with feedback and requests.
A few picks (unreviewed) from the most recent arXiv submissions, eyeing potentially interesting work on LLMs and AI in the finance/investment world.
The article titled "News-Aware Direct Reinforcement Trading for Financial Markets," submitted on October 22, 2025, by Qing-Yu Lan and four co-authors, addresses the challenge of incorporating news data into quantitative trading. The authors propose using news sentiment scores derived from large language models combined with raw price and volume data as inputs for reinforcement learning models. These inputs are processed by sequence models such as recurrent neural networks or Transformers to make end-to-end trading decisions. Experiments conducted in the cryptocurrency market evaluate two reinforcement learning algorithms: Double Deep Q-Network (DDQN) and Group Relative Policy Optimization (GRPO). The results show that this news-aware approach outperforms market benchmarks without relying on handcrafted features or manually designed rules. The study also emphasizes the importance of time-series information in achieving superior trading performance. 🔗 Source: arxiv.org | Found on Oct 23, 2025
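To make the setup concrete, here is a minimal sketch using synthetic data and tabular double Q-learning as a stand-in for the paper's DDQN; the state discretization, reward, and the simulated sentiment signal are all illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy market: the paper derives sentiment from news via LLMs; here a noisy
# signal that weakly predicts the next return stands in for it.
T = 2000
signal = rng.normal(0, 1, T)
returns = np.empty(T)
returns[0] = 0.0
returns[1:] = 0.005 * signal[:-1] + rng.normal(0, 0.01, T - 1)
sentiment = np.clip(0.5 * signal + rng.normal(0, 0.3, T), -1, 1)

def state(t):
    """Discretize (sentiment bucket, last return sign) into one of 6 states."""
    s_bin = int(np.digitize(sentiment[t], [-0.33, 0.33]))   # 0, 1, 2
    r_bin = int(returns[t] > 0)                              # 0, 1
    return s_bin * 2 + r_bin

ACTIONS = [-1, 0, 1]   # short, flat, long
Q1 = np.zeros((6, 3))
Q2 = np.zeros((6, 3))
alpha, gamma, eps = 0.1, 0.9, 0.1

for t in range(T - 1):
    s = state(t)
    a = int(rng.integers(3)) if rng.random() < eps else int(np.argmax(Q1[s] + Q2[s]))
    reward = ACTIONS[a] * returns[t + 1]      # P&L of holding the chosen position
    s2 = state(t + 1)
    if rng.random() < 0.5:                    # double Q-learning: decoupled
        a_star = int(np.argmax(Q1[s2]))       # selection and evaluation
        Q1[s, a] += alpha * (reward + gamma * Q2[s2, a_star] - Q1[s, a])
    else:
        a_star = int(np.argmax(Q2[s2]))
        Q2[s, a] += alpha * (reward + gamma * Q1[s2, a_star] - Q2[s, a])

policy = (Q1 + Q2).argmax(axis=1)   # learned action per discrete state
print(policy)
```

The two-table update is the core idea DDQN inherits: one estimator selects the greedy action while the other evaluates it, which damps the value-overestimation bias of plain Q-learning. The paper replaces this toy tabular state with sequence models (RNNs/Transformers) over raw price, volume, and sentiment series.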
Submitted on 18 Oct 2025, Abraham Atsiwo’s arXiv:2510.16636 (DOI 10.48550/arXiv.2510.16636) presents a three-step machine learning framework to predict bubbles in the S&P 500 by integrating financial news sentiment with macroeconomic indicators. Step 1 identifies bubble periods using a right-tailed unit root test, a widely recognized real-time detection method. Step 2 extracts sentiment features from large-scale financial news via natural language processing to capture investors’ expectations and behavioral patterns. Step 3 employs ensemble learning to predict bubble occurrences using sentiment-based and macroeconomic predictors. Performance is evaluated with k-fold cross-validation and benchmarked against other machine learning algorithms. Empirical results show significantly improved predictive accuracy and robustness, offering early warning insights for investors, regulators, and policymakers to help mitigate systemic financial risks. 🔗 Source: arxiv.org | Found on Oct 23, 2025
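Step 1's right-tailed unit root test can be sketched with a simplified sup-ADF statistic in the spirit of real-time bubble detection; this is an illustrative toy on synthetic series (no lag terms, no critical-value calibration), not the paper's exact procedure:

```python
import numpy as np

def df_tstat(y):
    """t-statistic on b in the Dickey-Fuller regression dy_t = a + b*y_{t-1} + e_t."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

def sadf(y, min_window=40):
    """Sup-ADF: the right-tailed statistic is the sup of DF t-stats over
    expanding windows; large positive values flag explosive (bubble) behavior."""
    return max(df_tstat(y[:n]) for n in range(min_window, len(y) + 1))

rng = np.random.default_rng(1)
rw = np.cumsum(rng.normal(0, 1, 200))   # random walk: no bubble
exp_y = np.empty(200)                    # mildly explosive AR(1.03) segment
exp_y[0] = 5.0
for t in range(1, 200):
    exp_y[t] = 1.03 * exp_y[t - 1] + rng.normal(0, 1)

print(round(sadf(rw), 2), round(sadf(exp_y), 2))
```

Under the null of a unit root the statistic stays small, while an explosive root pushes it sharply positive; the bubble labels from this step would then become the target for the sentiment-plus-macro ensemble classifier in steps 2 and 3.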
Submitted on 23 Oct 2025, Haonan Bian’s survey “LLM-empowered knowledge graph construction” (arXiv:2510.20345; https://doi.org/10.48550/arXiv.2510.20345) reviews how large language models shift knowledge graph (KG) construction from rule-based/statistical pipelines to language-driven, generative frameworks. It analyzes the classical three-layer pipeline (ontology engineering, knowledge extraction, and knowledge fusion) through schema-based (structure, normalization, consistency) and schema-free (flexibility, adaptability, open discovery) paradigms, synthesizing representative frameworks along with their mechanisms and limitations. The paper highlights future directions: KG-based reasoning for LLMs, dynamic knowledge memory for agentic systems, and multimodal KG construction. 🔗 Source: arxiv.org | Found on Oct 24, 2025
The article titled "Comparing LLMs for Sentiment Analysis in Financial Market News," submitted on October 3, 2025, by Lucas Eduardo Pereira Teles and Carlos M. S. Figueiredo, presents a comparative study of large language models (LLMs) applied to sentiment analysis of financial market news. The study evaluates the performance differences between LLMs and classical approaches on this finance-specific natural language processing task. The results demonstrate that large language models outperform classical models in the vast majority of cases, and the paper quantifies the benefit of each tested model and approach. 🔗 Source: arxiv.org | Found on Oct 23, 2025
Submitted on 20 Oct 2025, the paper by Kefan Chen, Hussain Ahmad, Diksha Goel, and Claudia Szabo introduces 3S-Trader, a training-free multi-LLM framework for portfolio construction. It addresses limitations of single-stock-focused methods and inflexible strategies by integrating three modules: scoring, which summarizes each stock’s recent signals across multiple dimensions; strategy, which analyzes historical strategies and overall market conditions to iteratively generate an optimized selection strategy; and selection, which assembles portfolios by choosing stocks with higher scores in relevant dimensions. Evaluated across four stock universes, including Dow Jones Industrial Average constituents and three sector-specific sets, 3S-Trader outperforms existing multi-LLM and time-series baselines, achieving an accumulated return of 131.83% on DJIA constituents with a Sharpe ratio of 0.31 and a Calmar ratio of 11.84, and delivering consistently strong results in other sectors. 🔗 Source: arxiv.org | Found on Oct 23, 2025
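The scoring-and-selection flow can be illustrated with a toy example; the stock names, dimensions, and strategy weights below are hypothetical placeholders (in the paper, LLM modules produce the per-dimension scores and iteratively refine the strategy from history and market conditions):

```python
import numpy as np

# Hypothetical scoring output: rows = stocks, columns = scoring dimensions
# (e.g. momentum, news tone, fundamentals) -- illustrative numbers only.
stocks = ["AAPL", "MSFT", "JNJ", "CVX", "GS"]
scores = np.array([
    [0.8, 0.6, 0.4],
    [0.7, 0.9, 0.5],
    [0.2, 0.3, 0.9],
    [0.4, 0.1, 0.6],
    [0.6, 0.5, 0.2],
])

def select(scores, strategy_weights, k=2):
    """Selection module: rank stocks by strategy-weighted total score,
    take the top k, and return equal weights over the chosen names."""
    total = scores @ strategy_weights
    top = np.argsort(total)[::-1][:k]
    return {stocks[i]: 1.0 / k for i in top}

# A 'strategy' that emphasizes the second (news-tone) dimension; the strategy
# module would generate and revise such weightings each period.
portfolio = select(scores, np.array([0.2, 0.6, 0.2]))
print(portfolio)   # -> {'MSFT': 0.5, 'AAPL': 0.5}
```

The point of the three-module split is that the strategy (the dimension weighting) can change with market regime while the scoring and selection mechanics stay fixed, all without any model training.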
Actions, news, and announcements by investment firms, in the context of technology applications.
Three leaders at Two Sigma appeared on recent podcasts, emphasizing balancing speed with rigor, technology as an enabler of human creativity, and continuous adaptation to evolving platforms. Effie Baram (Data Engineering Podcast) described a shift from shipping only production-ready datasets to offering both raw and curated data, standardized on BigQuery with transformations as code and CI/CD, and “ice cube” data contracts covering 90% of use cases. Ben Wellington (TWIML AI Podcast) said LLMs can compress feature engineering from months to minutes—e.g., testing whether a CEO touching their nose predicts stock moves—while stressing point-in-time, open-source models to avoid temporal leakage and exploring orthogonal signals from thousands of AI agents. Matt Greenwood (Dev Interrupted) reflected on 20+ years, from yanking hard drives out of vendor machines to managing millions of lines of code, framing innovation as stacked S-curves, advocating an “epsilon and omega” approach, and defining AI roles: advisory, oracle, operational, and agentic. 🔗 Source: twosigma.com | Found on Oct 23, 2025
The article by Ola Mahmoud and Andreas Mütter (October 24, 2025) argues that FX forecasting resists static models, citing Meese and Rogoff (1983), who showed structural macro models underperforming a naive random walk even when given future macro inputs. It proposes sequence learning via attention-equipped LSTMs ingesting about 100 time-series features, and ensembles of 100 LSTMs (e.g., for AUD/EUR). Live backtests show monthly forecast accuracy aligning with realized returns; ensemble strategies exhibit ~30% average pairwise correlation and positive pairwise alpha, and a no-shorting, equalized-volatility tangency portfolio outperforms any single model. 🔗 Source: am.vontobel.com | Found on Oct 24, 2025
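The diversification effect behind the ensemble claim can be sketched numerically. The simulation below uses synthetic return streams (not the authors' LSTM outputs) with ~30% average pairwise correlation, equalizes their volatilities, and combines them equally as a rough proxy for the equalized-volatility portfolio:

```python
import numpy as np

rng = np.random.default_rng(7)

# 100 synthetic strategy return streams sharing a common factor, so the
# average pairwise correlation is ~0.30; each has a small positive mean.
n_models, n_months, rho = 100, 240, 0.30
common = rng.normal(0, 1, n_months)
idio = rng.normal(0, 1, (n_models, n_months))
raw = np.sqrt(rho) * common + np.sqrt(1 - rho) * idio + 0.1

# Equalize volatility across models, then combine equally: a simple proxy
# for a no-shorting, equal-risk combination of the ensemble members.
scaled = raw / raw.std(axis=1, keepdims=True)
combo = scaled.mean(axis=0)

# With pairwise correlation rho = 0.3, the combination's variance shrinks
# toward rho while the mean is roughly preserved, so risk-adjusted
# performance beats any single member in expectation.
print(scaled[0].std(), combo.std())
```

At 30% average correlation, the idiosyncratic noise of 100 members largely averages out, which is the mechanism by which the combined portfolio can outperform every individual model.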
On October 21, 2025, Meta and funds managed by Blue Owl Capital formed a joint venture to develop and own the Hyperion data center campus in Richland Parish, Louisiana. Blue Owl funds will own 80% and Meta 20%, with approximately $27 billion in total development costs funded pro rata. Meta contributed land and construction-in-progress assets; Blue Owl contributed about $7 billion in cash, and Meta received a one-time distribution of about $3 billion. Meta will provide construction and property management and has entered operating leases for all facilities with a four-year initial term and extension options, plus a residual value guarantee for the first 16 years. Construction is underway with thousands of workers and, once online, will support over 500 operational jobs. Part of Blue Owl’s capital will be debt issued to PIMCO and other bond investors. Morgan Stanley advised Meta and was sole bookrunner. 🔗 Source: about.fb.com | Found on Oct 21, 2025
Investment manager Josh Sambrook-Smith reports that AI investment commitments will exceed a trillion dollars, led by OpenAI’s pledge of over USD 1 trillion through 2030 across multiple infrastructure providers, alongside multi-billion-dollar commitments from Meta, Anthropic, and xAI. Early beneficiaries include Nvidia, AMD, and the broader GPU supply chain, though margins are currently lower than the market expected due to heavy spending and limited revenues. AI revenues now exceed USD 18 billion annually across firms such as Microsoft, OpenAI, Anthropic, and xAI, up from roughly USD 1 billion a year ago. OpenAI’s developer conference announced a free social-media video-generation app, Sora, for the App Store. The team is invested in Google (Gemini), Nvidia, Broadcom (custom high-speed chips for Amazon, Google, Microsoft), and semi-cap equipment makers, while avoiding single-component suppliers reliant on Nvidia. 🔗 Source: gam.com | Found on Oct 23, 2025
Focused on specific company types in financial information, alternative data, software solutions, and AI / LLM technologies.
On Oct. 21, 2025, Dataminr announced its intent to acquire ThreatConnect, valuing ThreatConnect at $290 million, to fuse Dataminr’s AI-powered public-data signals with ThreatConnect’s internal-data capabilities into agentic AI-powered, client-tailored intelligence. Dataminr Founder and CEO Ted Bailey highlighted ThreatConnect’s 170-person team and the goal of delivering real-time, context-aware, personalized intelligence that adapts to customers’ needs. ThreatConnect serves 250 enterprise and government organizations, including one-third of the Fortune 50, four of the five largest tech companies, and enterprises such as NatWest, Nike, Wells Fargo, Wyndham Hotels, and General Parts Corporation, as well as agencies in the U.S., UK, and Australia. Dataminr Pulse for Cyber Risk and ThreatConnect will become a joint offering with continued support and combined enhancements. Dataminr’s platform processes text in 150 languages and signals from one million public data sources, and is trusted by more than 100 U.S. government agencies, 20 international governments, and two-thirds of the Fortune 50. 🔗 Source: prnewswire.com | Found on Oct 22, 2025