Machine Learning-Based Signal Processing in Real Time
Meta Logos Systems, LLC, has distributed-processing versions of all of the core ML tools (including HMMs and SVMs). This enables signal processing that meets the real-time needs of applications by use of a scalable network of inexpensive computers.

Machine Learning (ML) methods for ND and NTD Channel Current Cheminformatics
The signal processing methods favored for channel current blockade signals are Machine Learning-based, owing to the stochastic nature of both the background noise and the modulation of the signal itself (the signal is not simply periodic but has a statistical profile that is stationary, referred to as having 'stationary statistics'). Most notable in this regard is the use of Hidden Markov Models (HMMs) for signal feature extraction. It turns out, however, that signal acquisition by Finite State Automaton (FSA), together with digital signal processing (DSP) methods, is much more efficient, and that downstream classification and clustering are best done by Support Vector Machines (SVMs) operating on the HMM-extracted features, especially when 'weak' data can be dropped in a streaming event-processing scenario (e.g., the sequence of channel blockade measurements that occurs here). In addition to the DSP, FSA, HMM, and SVM methods, there is also the 'glue' of a collection of search/optimization heuristics and a collection of metrics/measures, such as Shannon entropy and relative entropy.
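The HMM-feature-extraction-into-SVM-classification idea described above can be illustrated with a minimal sketch. The libraries (hmmlearn, scikit-learn), the 4-state HMM, and the occupation/level feature vector below are illustrative assumptions, not the SSA Protocol's actual implementation.

```python
# Minimal sketch: HMM feature extraction feeding an SVM classifier.
# Library choices and feature design are illustrative assumptions.
import numpy as np
from hmmlearn.hmm import GaussianHMM
from sklearn.svm import SVC

def hmm_features(trace, n_states=4):
    """Fit a Gaussian HMM to one blockade trace (1-D current samples) and
    return a feature vector: state-occupation fractions plus the learned
    state means, both ordered by blockade level."""
    model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    X = trace.reshape(-1, 1)
    model.fit(X)
    states = model.predict(X)
    occupation = np.bincount(states, minlength=n_states) / len(states)
    order = np.argsort(model.means_.ravel())   # sort states by current level
    return np.concatenate([occupation[order], model.means_.ravel()[order]])

def train_classifier(traces, labels):
    """Train an SVM on HMM-derived features from labeled blockade traces."""
    features = np.vstack([hmm_features(t) for t in traces])
    clf = SVC(kernel="rbf")   # RBF kernel as a common default choice
    clf.fit(features, labels)
    return clf
```

In a streaming setting, 'weak' events (e.g., traces whose SVM decision values fall near the margin) could simply be dropped rather than force-classified, consistent with the rejection strategy described above.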
The Stochastic Sequential Analysis (SSA) Protocol
The collection of information theory tools and search metaheuristics, together with the core DSP, FSA, HMM, and SVM methods, is first used in a non-workflow process (by an expert) to arrive at a workflow process for the bulk of the signal processing (99%) on the class of blockade signals being considered. From this initial setup there results a workflow process for signal processing that must still be adjusted periodically (due to ND/NTD device drift), which introduces a periodic need for further non-workflow processes to re-optimize. A workflow process can be managed with reasonable adaptability with or without the introduction of modern AI/LLM capabilities. For a non-workflow process, however, an Agentic AI capability is needed (which now exists). In other words, the SSA Protocol requires Agentic AI to automate, and thereby enable real-time 'nanoscope' or NTD capabilities.
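Because the deployed workflow must be re-tuned when the ND/NTD device drifts, one plausible drift check, sketched below using the relative entropy measure mentioned earlier, compares the blockade-level distribution of a recent data window against a baseline. The bin count and threshold are illustrative assumptions, not calibrated SSA values.

```python
# Sketch of a drift check for the workflow stage: compare a recent window's
# blockade-level histogram against a baseline using relative entropy
# (KL divergence). Bin count and threshold are illustrative assumptions.
import numpy as np

def relative_entropy(p, q, eps=1e-12):
    """D(p || q) = sum_i p_i * log(p_i / q_i), in nats; eps guards log(0)."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def drift_detected(baseline, recent, bins=64, threshold=0.1):
    """Histogram both traces on a common support and flag large divergence,
    signaling that the workflow needs (non-workflow) re-optimization."""
    lo = min(baseline.min(), recent.min())
    hi = max(baseline.max(), recent.max())
    p, _ = np.histogram(baseline, bins=bins, range=(lo, hi))
    q, _ = np.histogram(recent, bins=bins, range=(lo, hi))
    return relative_entropy(q.astype(float), p.astype(float)) > threshold
```

A check of this kind makes the "periodic need for re-optimization" operational: the workflow runs unattended until the drift score crosses the threshold, at which point the non-workflow re-tuning process is triggered.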
The SSA Protocol Agentic Gap can be solved with current Agentic AI Capabilities
Identifying the need for AI Agency to close the gap on some task does not immediately impart a functional, safe, trusted AI Agency to close that gap, far from it (with current 2025 AI capabilities). At issue is that the AI Agency must still be implemented with 'guard-rails' and extensive verification at every stage to be 'safe'. For this to occur, there must exist performance metrics and other metrics to verify calculations, such as exist in the SSA Protocol via its variety of information theory and search tools. In other words, an entire panoply of validation tools and metrics must be used to establish guard-rails for the AI, so the process envisaged must already be 'mapped out', in the sense that it has already been established by a (human) agency through prior practice in building such signal processing pipelines. This has been done for the SSA Protocol, leading to different workflows according to the data, as described in [10]. Thus, the SSA Protocol has an Agentic Gap in its performance, but it has the critical tools with which to close that Agentic Gap in the 'hands' of an Agentic AI.
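The guard-rail requirement can be made concrete with a small sketch: a workflow retuning proposed by the agent is accepted only if it clears the same validation metrics a human operator would apply. The gate structure, metric names, and thresholds below are illustrative assumptions.

```python
# Sketch of a guard-rail 'gate' for an Agentic AI: a retuned workflow
# proposed by the agent is accepted only if it clears every validation
# metric. Metric names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ValidationGate:
    min_accuracy: float = 0.95   # e.g., SVM hold-out accuracy floor
    max_drift: float = 0.10     # e.g., relative-entropy drift ceiling

    def accept(self, holdout_accuracy: float, drift_score: float) -> bool:
        """Return True only if every metric is inside its guard-rail;
        otherwise the proposal is rejected and escalated for human review."""
        return (holdout_accuracy >= self.min_accuracy
                and drift_score <= self.max_drift)
```

In such a scheme, the agent's autonomy is bounded by pre-established metrics: `ValidationGate().accept(acc, drift)` decides whether the retuned workflow replaces the running one, mirroring the verification a human expert would perform when adjusting the pipeline by hand.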