How to extend CAMS with triggers, packages, and data managers
The main purpose of the CARP Mobile Sensing framework is to allow for extension with domain-specific studies, triggers, tasks, measures, probes, and data types.
This is done by implementing classes that inherit from the (abstract) classes in the library.
Add Triggers
Define custom triggers to be used as part of a study protocol.
Add Sampling Packages
Introduce new measures, probes, data classes, and optional device managers.
Add Data Managers
Support custom storage or upload of measurements via new endpoints and data managers.
Add Data Transformers
Create data and privacy transformer schemas for data re-formatting and privacy control.
Extensions to CAMS often rely on serialization to/from JSON; this applies to triggers, measures, sampling schemas, data, etc.
CAMS relies on polymorphic serialization as supported by the carp_serializable package, which is an extension to the standard json_serializable package in Dart. Support for polymorphic JSON serialization involves four steps when designing your classes:
1
Define a serializable class
Extend from Serializable and annotate the class:
```dart
@JsonSerializable(includeIfNull: false, explicitToJson: true)
class FreeMemory extends Data {
  ...
}
```
2
JSON serializable methods
Implement the three JSON serializable methods:
```dart
@override
Function get fromJsonFunction => _$FreeMemoryFromJson;

factory FreeMemory.fromJson(Map<String, dynamic> json) =>
    FromJsonFactory().fromJson<FreeMemory>(json);

@override
Map<String, dynamic> toJson() => _$FreeMemoryToJson(this);
```
3
Registration
Register the class’s fromJsonFunction in the FromJsonFactory, e.g. in the onRegister() callback function of a sampling package:
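For example, the registration of the FreeMemory class from step 1 could look like this (a minimal sketch; the FromJsonFactory register method takes a prototype instance of the class):

```dart
@override
void onRegister() {
  // Register a prototype instance so the FromJsonFactory knows how to
  // deserialize FreeMemory objects from JSON.
  FromJsonFactory().register(FreeMemory());
}
```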
4
Generate serialization code
When all JSON classes have been implemented, the final step is to generate the JSON serialization helper classes. These are generated using build_runner, by running the following command in the root of your Flutter project:
```shell
dart run build_runner build --delete-conflicting-outputs
```
Triggers are a central part of a StudyProtocol, and CAMS allows you to create your own triggers and add them to the framework. This is done in the following steps:
Define one or more new Triggers.
Define a TriggerExecutor for each new trigger.
Define and register a TriggerFactory that knows how to create the correct TriggerExecutor based on a specific trigger.
Any trigger should extend the TriggerConfiguration class and implement domain-specific fields that configure this trigger. In the example below, we have defined a RemoteTrigger that listens to resources on a server identified by a URI, and triggers when this resource is available.
```dart
/// A trigger that triggers based on an event from a remote server.
@JsonSerializable(includeIfNull: false, explicitToJson: true)
class RemoteTrigger extends TriggerConfiguration {
  RemoteTrigger({
    required this.uri,
    this.interval = const Duration(minutes: 10),
  }) : super();

  /// The URI of the resource to listen to.
  String uri;

  /// How often should we check the server?
  Duration interval;

  @override
  Function get fromJsonFunction => _$RemoteTriggerFromJson;

  factory RemoteTrigger.fromJson(Map<String, dynamic> json) =>
      FromJsonFactory().fromJson<RemoteTrigger>(json);

  @override
  Map<String, dynamic> toJson() => _$RemoteTriggerToJson(this);
}
```
Trigger configurations are part of a study protocol and hence need to be serializable to/from JSON.
Each trigger needs a corresponding TriggerExecutor to execute the trigger at sampling runtime. An example of a RemoteTriggerExecutor is shown below. Every Executor in CAMS can implement runtime behavior on init, start, stop, restart, and dispose. The abstract TriggerExecutor class is a convenient starting point for implementing a trigger executor, since it has default implementations of all methods. Hence, you only need to override the lifecycle methods in which you need something to happen. Typically, and as shown below, the most relevant method to override is the onStart() method, which is called when sensing is started. In this method, you implement the trigger logic. In the RemoteTriggerExecutor, the trigger starts a periodic timer that regularly checks the resource specified by the URI in the trigger configuration. If the resource is available, the executor triggers by calling the onTrigger() callback method.
```dart
/// Executes a [RemoteTrigger], i.e., checks if there is a resource on
/// the server and triggers if so.
class RemoteTriggerExecutor extends TriggerExecutor<RemoteTrigger> {
  final client = Client();

  @override
  Future<bool> onStart() async {
    // Set up a periodic timer to look for a resource at the specified URI.
    timer = Timer.periodic(configuration!.interval, (_) async {
      var response = await client.get(
        Uri.parse(Uri.encodeFull(configuration!.uri)),
      );
      if (response.statusCode == HttpStatus.ok) {
        // If there is a resource at the specified URI, trigger this executor.
        onTrigger();
      }
    });
    return true;
  }
}
```
Note that this trigger executor is very simplistic and does not cover any edge cases, like exceptions from network errors.
The last step is to define a TriggerFactory that knows how to map triggers to their executors at runtime. An example, the RemoteTriggerFactory, is shown below. Note that a factory can handle multiple triggers, as defined in the set of trigger types it supports. The main method of the factory is the create method, which creates the correct trigger executor for the specified trigger.
```dart
/// A [TriggerFactory] for remote triggers.
class RemoteTriggerFactory implements TriggerFactory {
  @override
  Set<Type> types = {
    // Note that this factory might support several types of remote triggers.
    RemoteTrigger,
  };

  @override
  void onRegister() {
    // When registering this factory, add the triggers to the JSON serialization.
    FromJsonFactory().registerAll([RemoteTrigger(uri: 'uri')]);
  }

  @override
  TriggerExecutor<TriggerConfiguration> create(TriggerConfiguration trigger) =>
      switch (trigger) {
        RemoteTrigger _ => RemoteTriggerExecutor(),
        _ => ImmediateTriggerExecutor(),
      };
}
```
Note that in the onRegister() callback function, the trigger configurations are added to the FromJsonFactory(). This allows trigger configurations defined in a study protocol to be serialized to/from JSON.
The last step is to register this trigger factory with CAMS. This is done by calling:
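A sketch of this registration could look like the following (the exact registration call may differ between CAMS versions; the ExecutorFactory is assumed here to be the registry for trigger factories):

```dart
// Register the factory so CAMS can create a RemoteTriggerExecutor
// whenever a RemoteTrigger is used in a study protocol.
ExecutorFactory().registerTriggerFactory(RemoteTriggerFactory());
```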
A sampling package defines three main types of classes:

Data - specifies the data model of the measures collected.
Probe - implements the runtime of data collection.
DeviceManager - specifies how an external device is managed (if any).
Sampling packages that do not use an external device are simpler to implement, since they can rely on the default SmartphoneDeviceManager.
In the following, we will use the DeviceSamplingPackage as an example of a sampling package that does not use any external device.
The first step is that your new sampling package should implement the SamplingPackage interface:
```dart
class DeviceSamplingPackage extends SmartphoneSamplingPackage {
  /// Measure type for collection of basic device information.
  static const String DEVICE_INFORMATION =
      '${CarpDataTypes.CARP_NAMESPACE}.deviceinformation';

  /// Measure type for collection of free physical and virtual memory.
  static const String FREE_MEMORY =
      '${CarpDataTypes.CARP_NAMESPACE}.freememory';

  /// Measure type for collection of battery level and charging status.
  static const String BATTERY_STATE =
      '${CarpDataTypes.CARP_NAMESPACE}.batterystate';

  /// Measure type for collection of screen events (on/off/unlocked).
  static const String SCREEN_EVENT =
      '${CarpDataTypes.CARP_NAMESPACE}.screenevent';

  /// Measure type for collection of the time zone of the device.
  static const String TIMEZONE =
      '${CarpDataTypes.CARP_NAMESPACE}.timezone';

  @override
  DataTypeSamplingSchemeMap get samplingSchemes =>
      DataTypeSamplingSchemeMap.from([
        DataTypeSamplingScheme(CamsDataTypeMetaData(
          type: DEVICE_INFORMATION,
          displayName: "Device Information",
          timeType: DataTimeType.POINT,
          dataEventType: DataEventType.ONE_TIME,
        )),
        DataTypeSamplingScheme(
            CamsDataTypeMetaData(
              type: FREE_MEMORY,
              displayName: "Free Memory",
              timeType: DataTimeType.POINT,
            ),
            IntervalSamplingConfiguration(
              interval: const Duration(minutes: 1),
            )),
        DataTypeSamplingScheme(CamsDataTypeMetaData(
          type: BATTERY_STATE,
          displayName: "Battery State",
          timeType: DataTimeType.POINT,
        )),
        DataTypeSamplingScheme(CamsDataTypeMetaData(
          type: SCREEN_EVENT,
          displayName: "Screen Events",
          timeType: DataTimeType.POINT,
        )),
        DataTypeSamplingScheme(CamsDataTypeMetaData(
          type: TIMEZONE,
          displayName: "Device Timezone",
          timeType: DataTimeType.POINT,
          dataEventType: DataEventType.ONE_TIME,
        )),
      ]);

  @override
  void onRegister() {
    FromJsonFactory().registerAll([
      DeviceInformation(),
      BatteryState(),
      FreeMemory(),
      ScreenEvent(),
      Timezone(''),
    ]);
  }

  @override
  Probe? create(String type) => switch (type) {
        DEVICE_INFORMATION => DeviceProbe(),
        FREE_MEMORY => MemoryProbe(),
        BATTERY_STATE => BatteryProbe(),
        TIMEZONE => TimezoneProbe(),
        SCREEN_EVENT => (Platform.isAndroid) ? ScreenProbe() : null,
        _ => null,
      };
}
```
The measures supported by this package are listed as static strings, such as FREE_MEMORY defining the type dk.cachet.carp.freememory.
The DataTypeSamplingSchemeMap defines the configuration of each measure, by specifying the CamsDataTypeMetaData for each measure and its sampling configuration (if needed). The onRegister() method is called when the package is registered and in this case registers the data classes for JSON serialization. Finally, the create() method is called when a probe is to be created and returns the right probe based on the measure type.
A note on permissions: the CamsDataTypeMetaData allows specifying which OS-specific permissions are needed to collect a measure. In the measures used as an example above, no permissions are required. However, in the SensorSamplingPackage, the activityRecognition permission is required in order to collect step counts, as shown in the samplingSchemes of that package.
Default sampling configurations can be specified as part of samplingSchemes, like the IntervalSamplingConfiguration specified for the FREE_MEMORY measure type above. There are several built-in sampling configurations available; for example, the PeriodicSamplingConfiguration samples in periodic windows using both interval and duration settings.
You can also write your own sampling configuration tailored to a specific measure and hence to how data should be collected in a probe. As an example of how to write a sampling configuration, the IntervalSamplingConfiguration is shown below:
```dart
/// A sampling configuration that allows configuring the time [interval] in
/// between subsequent measurements.
@JsonSerializable(includeIfNull: false, explicitToJson: true)
class IntervalSamplingConfiguration extends PersistentSamplingConfiguration {
  /// Sampling interval (i.e., delay between sampling).
  Duration interval;

  IntervalSamplingConfiguration({required this.interval}) : super();

  @override
  Function get fromJsonFunction => _$IntervalSamplingConfigurationFromJson;

  @override
  Map<String, dynamic> toJson() => _$IntervalSamplingConfigurationToJson(this);

  factory IntervalSamplingConfiguration.fromJson(Map<String, dynamic> json) =>
      FromJsonFactory().fromJson(json) as IntervalSamplingConfiguration;
}
```
Note that a sampling configuration must extend from one of the sampling configuration types (e.g., SamplingConfiguration or PersistentSamplingConfiguration). Sampling configurations can be part of a study protocol, where you can override the sampling configuration of a measure.
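For example, a measure in a study protocol could override the default interval of the free memory measure. This is a sketch only; the exact Measure API (here, the overrideSamplingConfiguration property) may differ between CAMS versions:

```dart
// Add a free memory measure to a background task, sampling every 5 minutes
// instead of the 1-minute default defined in the sampling package.
var task = BackgroundTask()
  ..addMeasure(Measure(type: DeviceSamplingPackage.FREE_MEMORY)
    ..overrideSamplingConfiguration = IntervalSamplingConfiguration(
      interval: const Duration(minutes: 5),
    ));
```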
The next step is to implement probes for each measure type. A probe collects the data and returns it as a Measurement. The abstract Probe interface defines a probe and how to implement probes. But in order to create your own probes, CAMS has a set of predefined, abstract probes to extend from, such as the MeasurementProbe (collects a single measurement), the StreamProbe (wraps an underlying data stream), and buffering probes that buffer an underlying stream for a fixed period and then emit a measurement.
The DeviceProbe is an example of a simple MeasurementProbe that collects device information about the phone using the device_info_plus plugin, maps this to a DeviceInformation data object, and returns a Measurement with this data.
As shown below, since the DeviceProbe extends MeasurementProbe, it just needs to implement the getMeasurement() method, which returns the device info mapped to a DeviceInformation data item.
```dart
/// A probe that collects the device info about this device.
class DeviceProbe extends MeasurementProbe {
  @override
  Future<Measurement?> getMeasurement() async =>
      Measurement.fromData(DeviceInformation(
        deviceData: DeviceInfo().deviceData,
        platform: DeviceInfo().platform,
        deviceId: DeviceInfo().deviceID,
        deviceName: DeviceInfo().deviceName,
        deviceModel: DeviceInfo().deviceModel,
        deviceManufacturer: DeviceInfo().deviceManufacturer,
        operatingSystem: DeviceInfo().operatingSystemName,
        hardware: DeviceInfo().hardware,
      ));
}
```
The ScreenProbe is an example of a StreamProbe that collects screen activity data. This probe implements the stream property, which maps screen events from the screen_state plugin to ScreenEvent data objects, which in turn are wrapped in measurements using the Measurement.fromData factory.
```dart
/// A probe collecting screen events:
///  - SCREEN ON
///  - SCREEN OFF
///  - SCREEN UNLOCK
/// which are stored as a [ScreenEvent].
class ScreenProbe extends StreamProbe {
  Screen screen = Screen();

  @override
  Stream<Measurement> get stream => screen.screenStateStream.map(
        (event) =>
            Measurement.fromData(ScreenEvent.fromScreenStateEvent(event)),
      );
}
```
If a sampling package handles (i.e., collects data from) an external device or service, then it should be able to specify what type of device it supports and provide corresponding DeviceManager and DeviceConfiguration classes.
For example, in the eSense sampling package, the ESenseSamplingPackage specifies its device type and device manager as follows:
```dart
class ESenseSamplingPackage implements SamplingPackage {
  final DeviceManager _deviceManager =
      ESenseDeviceManager(ESenseDevice.DEVICE_TYPE);

  // other parts of the package omitted

  @override
  List<Permission> get permissions => [];

  @override
  String get deviceType => ESenseDevice.DEVICE_TYPE;

  @override
  DeviceManager get deviceManager => _deviceManager;
}
```
The ESenseDevice device configuration describes how an eSense device is to be configured:
```dart
/// A [DeviceConfiguration] for an eSense device used in a [StudyProtocol].
@JsonSerializable(fieldRename: FieldRename.none, includeIfNull: false)
class ESenseDevice extends BLEDevice<BLEDeviceRegistration> {
  /// The type of an eSense device.
  static const String DEVICE_TYPE =
      '${CamsDevice.CAMS_DEVICE_NAMESPACE}.ESenseDevice';

  /// The default role name for an eSense device.
  static const String DEFAULT_ROLE_NAME = 'eSense';

  /// The sampling rate in Hz of getting sensor data from the device.
  int samplingRate;

  ESenseDevice({
    super.roleName = ESenseDevice.DEFAULT_ROLE_NAME,
    super.isOptional = true,
    this.samplingRate = 10,
  });

  @override
  Function get fromJsonFunction => _$ESenseDeviceFromJson;

  factory ESenseDevice.fromJson(Map<String, dynamic> json) =>
      FromJsonFactory().fromJson(json) as ESenseDevice;

  @override
  Map<String, dynamic> toJson() => _$ESenseDeviceToJson(this);
}
```
Note that since a device configuration can be part of a study protocol, it needs to support JSON serialization. The device manager should implement the DeviceManager interface.
CAMS has a set of predefined device managers which can be extended, including the SmartphoneDeviceManager, ServiceManager, HardwareDeviceManager, and BLEDeviceManager.
Since the eSense sensor is a BLE device, the ESenseDeviceManager extends the BLEDeviceManager class.
```dart
/// A [DeviceManager] for the eSense device.
///
/// Note that eSense uses the [bleName] (and not the BLE address) for
/// connecting to it. Typically of the form `eSense-xxxx`.
class ESenseDeviceManager
    extends BLEDeviceManager<ESenseDevice, BLEDeviceRegistration> {
  Timer? _batteryTimer;
  StreamSubscription<ESenseEvent>? _batterySubscription;
  double? _voltageLevel;
  final StreamController<int> _batteryEventController =
      StreamController.broadcast();
  ESenseManager? _manager;

  /// The eSense device handler.
  /// Only available after [bleName] has been set.
  ESenseManager? get manager =>
      bleName != null ? _manager ??= ESenseManager(bleName!) : _manager = null;

  @override
  String? get displayName => bleName;

  /// An estimate of the battery level based on the voltage level.
  @override
  int? get batteryLevel => (_voltageLevel != null)
      ? ((1.19 * _voltageLevel! - 3.91) * 100).toInt()
      : null;

  @override
  Stream<int> get batteryEvents => _batteryEventController.stream;

  ESenseDeviceManager(super.type, {super.configuration});

  @override
  bool get canConnect => bleName != null;

  @override
  BLEDeviceRegistration createRegistration() => BLEDeviceRegistration(
        deviceDisplayName: bleName,
        isConnected: isConnected,
        batteryChargingState: batteryLevel != null
            ? HardwareDeviceRegistration.parseBatteryLevel(batteryLevel!)
            : BatteryChargingState.unknown,
        bleAddress: bleName ?? 'Unknown eSense Device',
        bleName: bleName,
      );

  @override
  Future<DeviceStatus> onConnect() async {
    try {
      // listen for connection events
      manager?.connectionEvents.listen((event) async {
        switch (event.type) {
          case ConnectionType.connected:
            status = DeviceStatus.connected;
            await manager?.setSamplingRate(configuration?.samplingRate ?? 10);

            // when connected, listen for battery events
            _batterySubscription = manager!.eSenseEvents.listen((event) {
              if (event is BatteryRead) {
                _voltageLevel = event.voltage;
                if (batteryLevel != null) {
                  _batteryEventController.add(batteryLevel!);
                }
              }
            });

            // set up a timer that asks for the voltage level
            _batteryTimer = Timer.periodic(const Duration(minutes: 2), (_) {
              if (status == DeviceStatus.connected) {
                manager?.getBatteryVoltage();
              }
            });
            break;
          case ConnectionType.unknown:
            status = DeviceStatus.unknown;
            break;
          case ConnectionType.device_found:
            status = DeviceStatus.connecting;
            break;
          case ConnectionType.device_not_found:
          case ConnectionType.disconnected:
            status = DeviceStatus.disconnected;
            _voltageLevel = null;
            _batteryTimer?.cancel();
            _batterySubscription?.cancel();
            break;
        }
      });

      // try to connect to the manager with the [bleName]
      manager?.connect();
    } catch (error) {
      warning(
        '$runtimeType - Error connecting to eSense device: $bleName - $error',
      );
      return DeviceStatus.disconnected;
    }
    return DeviceStatus.connecting;
  }

  @override
  Future<bool> onDisconnect() async => await manager?.disconnect() ?? false;
}
```
Once the eSense device manager is in place, the eSense probes (button and IMU sensor) can be implemented as stream probes.
```dart
/// Collects eSense button pressed events. It generates an [ESenseButton]
/// every time the button is pressed or released.
class ESenseButtonProbe extends StreamProbe {
  @override
  ESenseDeviceManager get deviceManager =>
      super.deviceManager as ESenseDeviceManager;

  @override
  Stream<Measurement>? get stream => (deviceManager.isConnected)
      ? deviceManager.manager!.eSenseEvents
          .where((event) => event.runtimeType == ButtonEventChanged)
          .map(
            (event) => Measurement.fromData(
              ESenseButton(
                deviceName: deviceManager.manager!.deviceName,
                pressed: (event as ButtonEventChanged).pressed,
              ),
            ),
          )
          .asBroadcastStream()
      : null;
}
```
Both the device manager and the probes make use of the ESenseManager from the esense_flutter plugin.
A package can be released as a Dart package on Pub.
We already provide a list of different sampling packages - both using the onboard phone sensors as well as external wearable devices and online services.
CAMS comes with a set of built-in and external data managers - see Data Managers for an overview. It is possible to extend CAMS with support for new data managers that can save or upload data to custom data backends.
Support for this is done by implementing three interfaces:
1
Create a data endpoint
Implement the DataEndPoint interface which is used in the study protocol.
2
Create a data manager
Implement the DataManager interface which uploads or saves the measurement as they are sampled.
3
Create a data manager factory
Create a DataManagerFactory that can create a data manager based on the data endpoint configuration.
A data endpoint is included in the study protocol and specifies what data manager to use for storing or forwarding data, and the configuration of this data manager. Any new data manager should have a corresponding data endpoint that extends from the DataEndPoint class. For example, the FileDataEndPoint specifies that data should be saved to a file using the FileDataManager data manager. This FileDataEndPoint allows for configuring the data manager by specifying buffer size and whether the file should be zipped or encrypted.
A DataEndPoint must support JSON serialization in order to be part of a study protocol.
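To illustrate, a minimal custom data endpoint could be sketched as follows. Note that the RestDataEndPoint class, its uploadUri field, and the 'dk.cachet.carp.rest' type string are all hypothetical, and the DataEndPoint constructor signature may differ between CAMS versions:

```dart
/// A (hypothetical) data endpoint configuration for uploading measurements
/// to a REST server.
@JsonSerializable(includeIfNull: false, explicitToJson: true)
class RestDataEndPoint extends DataEndPoint {
  /// The URI of the server to upload measurements to.
  String uploadUri;

  RestDataEndPoint({required this.uploadUri})
      : super(type: 'dk.cachet.carp.rest');

  @override
  Function get fromJsonFunction => _$RestDataEndPointFromJson;

  factory RestDataEndPoint.fromJson(Map<String, dynamic> json) =>
      FromJsonFactory().fromJson(json) as RestDataEndPoint;

  @override
  Map<String, dynamic> toJson() => _$RestDataEndPointToJson(this);
}
```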
A data manager implements the functionality for actually storing or forwarding the collected measurements.
Any new data manager should implement the DataManager interface:
```dart
/// The [DataManager] interface is used to upload [Measurement] objects to any
/// data manager that implements this interface.
abstract class DataManager {
  /// The deployment using this data manager.
  PrimaryDeviceDeployment get deployment;

  /// The ID of the study deployment that this manager is handling.
  String get studyDeploymentId;

  /// The type of this data manager as enumerated in [DataEndPointTypes].
  String get type;

  /// Configure the data manager by specifying the study [deployment], the
  /// [dataEndPoint], and the stream of [measurements] events to handle.
  Future<void> configure({
    required DataEndPoint dataEndPoint,
    required SmartphoneDeployment deployment,
    required Stream<Measurement> measurements,
  });

  /// Flush any buffered data and close this data manager.
  /// After calling [close] the data manager can no longer be used.
  Future<void> close();

  /// Stream of data manager events.
  Stream<DataManagerEvent> get events;

  /// On each measurement collected, the [onMeasurement] handler is called.
  ///
  /// Implementations of this interface should handle how to save
  /// or upload the [measurement].
  Future<void> onMeasurement(Measurement measurement);

  /// When the data stream closes, the [onDone] handler is called.
  Future<void> onDone();

  /// When an error event is sent on the stream, the [onError] handler is called.
  Future<void> onError(Object error);
}
```
All of these methods have to be implemented. However, the AbstractDataManager class provides a useful base class to start from.
For example, the ConsoleDataManager provides a very simple example of a data manager that prints a JSON-encoded version of the data to the console:
```dart
/// A very simple data manager that just "uploads" the data to the
/// console (i.e., prints it). Used mainly for testing and debugging purposes.
class ConsoleDataManager extends AbstractDataManager {
  @override
  String get type => DataEndPointTypes.PRINT;

  @override
  Future<void> onMeasurement(Measurement measurement) async =>
      debugPrint(jsonEncode(measurement));
}
```
When creating a new data manager, a corresponding DataManagerFactory must be provided. This factory knows how to create a data manager of a specific type when the study protocol is loaded. The interface looks like this:
```dart
/// A factory which can create a [DataManager] based on the `type` of a
/// [DataEndPoint].
abstract class DataManagerFactory {
  /// The [DataEndPoint] type.
  String get type;

  /// Create a [DataManager].
  DataManager create();
}
```
The factory for the ConsoleDataManager looks like this:
```dart
class ConsoleDataManagerFactory implements DataManagerFactory {
  @override
  String get type => DataEndPointTypes.PRINT;

  @override
  DataManager create() => ConsoleDataManager();
}
```
For a data manager to be used in CAMS, its factory must be registered in the DataManagerRegistry singleton like this:
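A sketch of this registration, following the registry pattern used elsewhere in CAMS, could look like:

```dart
// Register the factory so a ConsoleDataManager is created whenever a data
// endpoint of type DataEndPointTypes.PRINT is used in a study protocol.
DataManagerRegistry().register(ConsoleDataManagerFactory());
```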
Data transformation is supported by the DataTransformerSchema class. A custom schema is created by implementing the namespace getter and the onRegister() callback function.
As an example, the implementation of the Open mHealth transformer schema is shown below:
```dart
/// A default [DataTransformerSchema] for Open mHealth (OMH) transformers.
class OMHTransformerSchema extends DataTransformerSchema {
  @override
  String get namespace => NameSpace.OMH;

  @override
  void onRegister() {}
}
```
Each transformer schema must be registered in the DataTransformerSchemaRegistry (which is a singleton).
Hence, add the following line to the setup part of your app:
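A sketch of this registration could look like the following (the exact registration call may differ between CAMS versions):

```dart
// Register the OMH transformer schema in the transformer schema registry
// so that OMH transformers can be looked up by their namespace.
DataTransformerSchemaRegistry().register(OMHTransformerSchema());
```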
Once the schema is registered, transformers for each data type can be created.
Data transformation is a transformation of one type of data to another type of data. That is, a data transformation is defined by the DataTransformer typedef:
```dart
typedef DataTransformer = Data Function(Data);
```
For each Data you want to transform, you need to define a new class that also extends Data.
For example, the following OMHGeopositionDataPoint class represents an OMH Geoposition data point.
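A sketch of this class could look as follows. This is not the definitive implementation: the OMH schema classes (Geoposition, PlaneAngleUnitValue, DataPoint) come from the openmhealth_schemas package, and names and constructor signatures may differ between versions:

```dart
/// An OMH data point holding an OMH [Geoposition] body, created from a
/// CARP [Location] data object.
class OMHGeopositionDataPoint extends OMHContextDataPoint {
  OMHGeopositionDataPoint(super.datapoint);

  /// Map a CARP [Location] to an OMH [Geoposition] data point.
  factory OMHGeopositionDataPoint.fromLocationData(Location location) {
    var position = Geoposition(
      latitude: PlaneAngleUnitValue(
          unit: PlaneAngleUnit.DEGREE_OF_ARC, value: location.latitude),
      longitude: PlaneAngleUnitValue(
          unit: PlaneAngleUnit.DEGREE_OF_ARC, value: location.longitude),
    );
    return OMHGeopositionDataPoint(DataPoint(body: position));
  }

  factory OMHGeopositionDataPoint.fromJson(Map<String, dynamic> json) =>
      OMHGeopositionDataPoint(DataPoint.fromJson(json));

  /// The transformer from CARP [Location] to OMH [Geoposition].
  static DataTransformer get transformer =>
      (data) => OMHGeopositionDataPoint.fromLocationData(data as Location);
}
```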
The most important function to implement is the transformer, while also making sure that JSON serialization is supported (in the OMHGeopositionDataPoint example, this is handled by the superclass OMHContextDataPoint). The transformer is a function that can transform one type of Data to another. The mapping between the two data types happens in the fromLocationData factory. The toJson method is needed in order to serialize and store the data. Also, note that the namespace of the DataFormat of this OMHGeopositionDataPoint is OMH. Once the mapper Data class is created, it must be added to the schema. This is done by calling:
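Assuming the OMH schema has been registered, adding the transformer could be sketched like this (the exact lookup/add API and the LOCATION type constant are assumptions and may vary by CAMS version):

```dart
// Add the Location-to-Geoposition transformer to the OMH schema for the
// location data type.
DataTransformerSchemaRegistry().lookup(NameSpace.OMH)!.add(
    ContextSamplingPackage.LOCATION, OMHGeopositionDataPoint.transformer);
```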
Data transformation can be done "manually" by looking up a specific transformer schema and applying its transformer. For example, the following code transforms a Location into an OMHGeopositionDataPoint.
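A sketch of such a manual transformation, assuming a Location transformer has been registered in the OMH schema:

```dart
// Look up the OMH transformer schema and transform a Location data object.
Location location = Location(latitude: 55.68, longitude: 12.58);
var geoposition =
    DataTransformerSchemaRegistry().lookup(NameSpace.OMH)!.transform(location);
```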
However, a more general and common use of transformers is to specify what data format the DataEndPoint should use.
This is done via the dataFormat property, which specifies the namespace, like NameSpace.OMH. The following data endpoint saves measurements to a file using the Open mHealth (OMH) data format:
```dart
// Create a study protocol storing data in files using the OMH data format.
var protocol = SmartphoneStudyProtocol(
  ownerId: 'AB',
  name: 'Track patient movement',
  dataEndPoint: FileDataEndPoint(
    bufferSize: 500 * 1000,
    zip: true,
    encrypt: false,
    dataFormat: NameSpace.OMH,
  ),
);
```
Note that not all data types in CAMS have a corresponding OMH data format. In case no OMH transformer is found, the data is stored in the original CARP data format.
A special instance of a data transformer schema is the PrivacySchema, which basically takes a piece of data and protects relevant properties.
CAMS comes with a built-in schema with a transformer namespace specified in PrivacySchema.DEFAULT.
Hence, to add privacy protection, two things have to be implemented:

A transformer function that knows how to transform a Data object so that its properties are privacy-protected.
Registration of this transformer function in the PrivacySchema.DEFAULT schema.
Below is an example of a privacy function that anonymizes text messages:
```dart
/// A [TextMessage] anonymizer function. Anonymizes:
///  - address
///  - body
TextMessage textMessageAnonymizer(Data data) {
  assert(data is TextMessage);
  var msg = data as TextMessage;
  if (msg.address != null) {
    msg.address = sha1.convert(utf8.encode(msg.address!)).toString();
  }
  if (msg.body != null) {
    msg.body = sha1.convert(utf8.encode(msg.body!)).toString();
  }
  return msg;
}
```
This function can be put anywhere in the sampling package, but a typical place is a separate file (if you have many functions) or the main package Dart file (if you only have a few). The anonymizer function must then be registered in the PrivacySchema.DEFAULT schema:
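A sketch of this registration, following the registry pattern used elsewhere in CAMS (the TEXT_MESSAGE type constant is assumed to come from the communication package):

```dart
// Register the anonymizer function for the text message data type in the
// default privacy schema.
DataTransformerSchemaRegistry().lookup(PrivacySchema.DEFAULT)!.add(
    CommunicationSamplingPackage.TEXT_MESSAGE, textMessageAnonymizer);
```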
This is typically done as part of the onRegister() callback in the package.
The example above is implemented as part of the carp_communication_package package.
It is good practice to consider if your data sampling package needs to supply privacy-protecting functions, and if so, add these to the default privacy schema.