Intelligence Service
Intelligence Service is responsible for providing DataSapien's patented three-tier on-edge intelligence approach. Intelligence Service allows access to:
- Rules - the first level of intelligence; static in nature
- ML Models - conventional probabilistic models designed to work in a specific field
- AI Models - Generative AI Models designed to work in a wide range of fields
Rules
Rules are designed on DataSapien Orchestrator and deployed to Mobile SDK instances. Mobile SDK evaluates rules at various points during the app lifecycle:
- Upon collection or change of MeData values (triggered by MeData Service)
- When the SDK is initialised by the host application
- When Intelligence Service rule evaluation is called by scripts
- When Intelligence Service rule evaluation is called by your host app
Note that rule evaluation may be limited, or impossible altogether, when your host app is in the background because of restrictions imposed by mobile operating systems.
Using ML Models
To use an ML model, you must first provision it on DataSapien Orchestrator. Each ML model has a unique programmatic name; to invoke a model, pass its unique name to Intelligence Service.
If no model is available for the name you provide, Intelligence Service returns an error indicating this. You can query ML models and start their download using Intelligence Service functions.
You can invoke an ML model in one of two ways:
- Directly from your host application: in this case, Mobile SDK is simply the delivery channel and wrapper for the ML models
- From scripts: in this case, you can use ML models in any Mobile SDK-provided use case, including Journeys & Exchanges
Using AI Models
Similarly to ML models, AI models must first be provisioned on DataSapien Orchestrator, and each AI model likewise has a unique programmatic name.
AI models are invoked in the same two ways as ML models: directly from your host app or via scripts.
Hallucinations & Fact Checking
AI models, because of their generative nature, are prone to hallucinations. The DataSapien architecture allows you to fact-check AI output by:
- Writing scripts to implement simple accept / reject algorithms
- Using an ML model: feeding AI output to an ML model for fact checking
- Using an additional AI Model: feeding first AI output to an additional AI model
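The first option can be sketched as a tiny accept / reject check. Everything below (the factCheck helper, the knownFacts dictionary) is illustrative only and not part of the SDK; a real script would compare the output against on-device MeData values.

```swift
import Foundation

// Minimal accept / reject fact check. Accepts the AI output only when
// every known fact it mentions appears with the expected value.
// All names here are illustrative, not SDK API.
func factCheck(output: String, knownFacts: [String: String]) -> Bool {
    for (key, expected) in knownFacts {
        // Only check facts the output actually mentions.
        guard output.localizedCaseInsensitiveContains(key) else { continue }
        // Reject when the claimed value contradicts the known value.
        if !output.contains(expected) { return false }
    }
    return true
}

let facts = ["heart rate": "72"]
let accepted = factCheck(output: "Your average heart rate is 72 bpm.", knownFacts: facts)
let rejected = factCheck(output: "Your average heart rate is 95 bpm.", knownFacts: facts)
```

A production check would be more robust (numeric tolerance, unit handling), but the accept / reject shape stays the same.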
Intelligence Service Functions
To access Intelligence Service functions, get its instance from the DataSapien object: DataSapien.getIntelligenceService().
Check if model is downloaded on device
Checks whether the given model is downloaded on the device.
- Swift
- Kotlin
// Signature
public func isModelDownloaded(modelName: String) -> Bool
// Usage
DataSapien.getIntelligenceService().isModelDownloaded(modelName: "llama-3.2")
// TBD...
Download & Load model
Downloads the given model; if it is already downloaded, loads it into memory.
- Swift
- Kotlin
// Signature
public func load(modelName: String, status: @escaping @Sendable (Double) -> (), completion: @escaping @Sendable (ModelContainer) -> (), error: @escaping @Sendable (Error) -> ())
// TBD...
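A usage sketch for load, assuming the Swift signature above. The stub class and ModelContainer defined here only stand in for the SDK types so the example is self-contained; in the host app you would call the instance returned by DataSapien.getIntelligenceService(), and @Sendable is dropped from the closure types for brevity.

```swift
// Stand-ins for the SDK types so this sketch compiles on its own.
struct ModelContainer { let name: String }

final class IntelligenceServiceStub {
    func load(modelName: String,
              status: @escaping (Double) -> (),       // download progress, 0.0...1.0
              completion: @escaping (ModelContainer) -> (),
              error: @escaping (Error) -> ()) {
        status(0.5)                                   // halfway through the download
        status(1.0)
        completion(ModelContainer(name: modelName))   // downloaded (or cached) and loaded
    }
}

var loadedModel = ""
IntelligenceServiceStub().load(modelName: "llama-3.2",
    status: { progress in print("progress: \(progress)") },
    completion: { container in loadedModel = container.name },
    error: { err in print("load failed: \(err)") })
```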
Downloaded model list
Returns the list of models downloaded to the host application.
- Swift
- Kotlin
// Signature
public func getDownloadedModelsList() -> [String]
// TBD...
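The two query functions can be combined to decide whether a download is needed. The stub below mirrors the signatures shown above so the sketch runs on its own; in the host app, use the real instance from DataSapien.getIntelligenceService().

```swift
// Stub mirroring isModelDownloaded and getDownloadedModelsList;
// the list contents are illustrative only.
final class IntelligenceServiceStub {
    private let downloaded = ["llama-3.2"]
    func getDownloadedModelsList() -> [String] { downloaded }
    func isModelDownloaded(modelName: String) -> Bool {
        downloaded.contains(modelName)
    }
}

let service = IntelligenceServiceStub()
let models = service.getDownloadedModelsList()
// Start a download only when the model is not already on the device.
let needsDownload = !service.isModelDownloaded(modelName: "llama-3.2")
```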
Invoke model
Invokes the given model with a prompt.
- Swift
- Kotlin
// Signature
public func invoke(modelName: String, systemPrompt: String, streaming: @escaping @Sendable (String) -> (), completion: @escaping @Sendable (String) -> (), error: @escaping @Sendable (Error) -> ())
// TBD...
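A usage sketch for invoke, assuming the Swift signature above. The stub stands in for the SDK (with the @Sendable annotations dropped for brevity) so the sketch is self-contained; in the host app, call the instance from DataSapien.getIntelligenceService().

```swift
// Stub standing in for the real service. A real model streams tokens
// as they are generated; the stub emits a fixed reply so the control
// flow of the three callbacks is visible.
final class IntelligenceServiceStub {
    func invoke(modelName: String,
                systemPrompt: String,
                streaming: @escaping (String) -> (),
                completion: @escaping (String) -> (),
                error: @escaping (Error) -> ()) {
        streaming("Hello")
        streaming(" world")
        completion("Hello world")
    }
}

var transcript = ""
var finalAnswer = ""
IntelligenceServiceStub().invoke(modelName: "llama-3.2",
    systemPrompt: "You are a concise assistant.",
    streaming: { token in transcript += token },   // partial output chunks
    completion: { full in finalAnswer = full },    // full response
    error: { err in print("invoke failed: \(err)") })
```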
Unload/stop model
Stops the model and unloads it from memory.
- Swift
- Kotlin
// Signature
public func stop()
// TBD...