Data Activator Preview: What to Expect and How to Prepare
Data Activator is in public preview as part of Microsoft Fabric. Understanding its current capabilities and limitations helps you plan for production adoption.
Preview Status and Availability
As of January 2024, Data Activator:
- Is in public preview in all Fabric regions
- Requires a Fabric capacity (F64 or higher recommended)
- Is free to use during preview, though it still consumes capacity units
Current Capabilities
Supported Data Sources
supported_sources = {
"real_time": [
"Eventstreams",
"Power BI streaming datasets",
"Real-Time Hub events"
],
"batch": [
"Power BI semantic models",
"KQL databases (via queries)"
]
}
Supported Actions
supported_actions = {
"notifications": [
"Email (via Microsoft 365)",
"Microsoft Teams (webhook)",
"Power Automate (HTTP trigger)"
],
"coming_soon": [
"Direct webhook calls",
"Azure Functions",
"Logic Apps native"
]
}
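Of the currently supported actions, the Power Automate path is arguably the most flexible: a flow built on the "When an HTTP request is received" trigger exposes a URL that accepts arbitrary JSON. As a rough sketch of what the resulting call looks like (the flow URL and payload fields here are illustrative placeholders, not a documented Data Activator contract):
import requests

# Hypothetical flow URL copied from the Power Automate HTTP trigger
FLOW_URL = "https://prod-00.westus.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke"

# Illustrative payload shape; define matching fields in the flow's request schema
payload = {
    "alertName": "HighTemperature",
    "deviceId": "sensor-042",
    "value": 84.2,
    "severity": "Critical",
}
response = requests.post(FLOW_URL, json=payload, timeout=10)
response.raise_for_status()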
Setting Up for Preview
Workspace Configuration
# Conceptual view of workspace-level Data Activator settings
# (shown as a dict for reference, not a literal API payload)
workspace_settings = {
"dataActivator": {
"enabled": True,
"defaultCapacity": "F64",
"alertingQuota": {
"maxTriggersPerWorkspace": 100,
"maxActionsPerMinute": 60
}
}
}
Creating Your First Trigger
# Using the Fabric REST API (preview)
import requests

def create_data_activator_item(
    workspace_id: str,
    name: str,
    token: str,
) -> dict:
    """Create a new Data Activator item in the given workspace."""
    url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/dataActivators"
    payload = {
        "displayName": name,
        "description": "Preview Data Activator for monitoring",
    }
    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {token}"},
        json=payload,
    )
    response.raise_for_status()  # fail loudly on HTTP errors instead of returning an error body
    return response.json()
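Calling this helper requires an Azure AD token scoped to the Fabric API. A minimal usage sketch with the azure-identity package (the workspace ID is a placeholder, and DefaultAzureCredential assumes you are already signed in via the Azure CLI, environment variables, or a managed identity):
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
# Scope for the Fabric REST API
token = credential.get_token("https://api.fabric.microsoft.com/.default").token

item = create_data_activator_item(
    workspace_id="00000000-0000-0000-0000-000000000000",  # placeholder
    name="capacity-monitoring",
    token=token,
)
print(item)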
Preview Limitations
Known Constraints
preview_limitations = {
"scale": {
"max_triggers_per_item": 50,
"max_objects_per_item": 100,
"max_properties_per_object": 20,
"max_evaluations_per_minute": 1000
},
"features_not_available": [
"Custom code in triggers",
"Complex aggregation windows (> 1 hour)",
"Cross-workspace data sources",
"Trigger templates",
"Bulk import/export"
],
"reliability": {
"sla": "No SLA during preview",
"data_retention": "7 days for trigger history",
"backup": "No automated backup"
}
}
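Since these quotas are hard limits during preview, it is worth checking a planned deployment against them before you build. A small sketch driven by the preview_limitations dict above (the plan keys are hypothetical names for your own sizing estimates):
def check_preview_quota(plan: dict) -> list:
    """Return human-readable violations of the preview scale limits."""
    limits = preview_limitations["scale"]
    checks = {
        "max_triggers_per_item": plan.get("triggers", 0),
        "max_objects_per_item": plan.get("objects", 0),
        "max_properties_per_object": plan.get("properties_per_object", 0),
    }
    return [
        f"{name}: planned {value} exceeds limit {limits[name]}"
        for name, value in checks.items()
        if value > limits[name]
    ]

# Example: 60 triggers trips the 50-trigger-per-item limit
violations = check_preview_quota({"triggers": 60, "objects": 40, "properties_per_object": 12})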
Workarounds
# Workaround: complex conditions via KQL
# Instead of complex Data Activator conditions, pre-process in KQL.
# Materialized views only allow a single summarize over the source table,
# so the post-aggregation logic lives in a stored function instead.
kql_query = """
// Stored function encapsulating the alert logic
.create-or-alter function AlertConditions() {
    Telemetry
    | where Timestamp > ago(1h)
    | summarize
        AvgTemp = avg(Temperature),
        MaxTemp = max(Temperature),
        P95Temp = percentile(Temperature, 95)
        by DeviceId, bin(Timestamp, 5m)
    | extend AlertLevel = case(
        MaxTemp > 80, "Critical",
        MaxTemp > 70, "Warning",
        "Normal")
    | where AlertLevel != "Normal"
}
"""
# Then use a simple Data Activator trigger on the AlertConditions output
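If you manage KQL artifacts from Python, the azure-kusto-data client can run this control command directly. A sketch, assuming that package is installed and with placeholder cluster and database names:
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Placeholder cluster URI and database; copy the real values from your KQL database
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://<your-eventhouse>.kusto.fabric.microsoft.com"
)
client = KustoClient(kcsb)
client.execute_mgmt("<your-database>", kql_query)  # control commands go through execute_mgmt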
Preparing for GA
Design Principles
ga_preparation_checklist = {
"architecture": [
"Design for scale (10x current volume)",
"Plan trigger hierarchy (objects -> triggers -> actions)",
"Document escalation patterns",
"Define ownership and responsibilities"
],
"governance": [
"Establish naming conventions",
"Create approval workflow for new triggers",
"Define alert severity levels",
"Plan for audit and compliance"
],
"operations": [
"Set up monitoring for Data Activator itself",
"Create runbooks for common alerts",
"Train operations team",
"Establish alert review cadence"
]
}
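The naming-conventions item above is easy to make concrete. A sketch of a validator for one possible convention, <env>-<team>-<severity>-<metric> (the pattern itself is an assumption; substitute whatever your team agrees on):
import re

# Hypothetical convention: <env>-<team>-<severity>-<metric>,
# e.g. "prod-iot-critical-temperature"
TRIGGER_NAME_PATTERN = re.compile(
    r"^(dev|test|prod)-[a-z0-9]+-(info|warning|critical)-[a-z0-9_]+$"
)

def is_valid_trigger_name(name: str) -> bool:
    """Check a trigger name against the team naming convention."""
    return bool(TRIGGER_NAME_PATTERN.match(name))

assert is_valid_trigger_name("prod-iot-critical-temperature")
assert not is_valid_trigger_name("My Test Trigger")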
Migration Strategy
# Prepare for migrating from preview to GA
import json

def document_current_triggers(workspace_id: str) -> list:
    """Export current trigger configurations for migration.

    Assumes helpers get_all_triggers() and get_trigger_count() that wrap
    the relevant REST calls.
    """
    triggers = get_all_triggers(workspace_id)
    documented = []
    for trigger in triggers:
        documented.append({
            "name": trigger["name"],
            "object": trigger["object"],
            "condition": trigger["condition"],
            "actions": trigger["actions"],
            "created": trigger["created"],
            "last_modified": trigger["lastModified"],
            "trigger_count_30d": get_trigger_count(trigger["id"], days=30),
        })
    return documented

# Export to JSON for backup
triggers = document_current_triggers(workspace_id)  # workspace_id as defined earlier
with open("data_activator_backup.json", "w") as f:
    json.dump(triggers, f, indent=2)
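Because bulk import/export is not available yet, comparing two backup snapshots is a useful sanity check before and after any migration step. A minimal sketch that diffs backups by trigger name:
def diff_backups(old_path: str, new_path: str) -> dict:
    """Compare two trigger backup files produced above."""
    with open(old_path) as f_old, open(new_path) as f_new:
        old = {t["name"]: t for t in json.load(f_old)}
        new = {t["name"]: t for t in json.load(f_new)}
    return {
        "added": sorted(new.keys() - old.keys()),
        "removed": sorted(old.keys() - new.keys()),
        "changed": sorted(
            name for name in old.keys() & new.keys()
            if old[name]["condition"] != new[name]["condition"]
        ),
    }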
Best Practices for Preview
1. Start Small
# Begin with non-critical alerts
pilot_use_cases = [
"Development environment monitoring",
"Non-production data quality alerts",
"Internal team notifications",
"Experimental dashboards"
]
# Avoid for now
avoid_in_preview = [
"Production SLA alerts",
"Customer-facing notifications",
"Compliance-related triggers",
"High-frequency trading signals"
]
2. Build Redundancy
# Don't rely solely on Data Activator for critical alerts
redundant_alerting = {
"primary": "Data Activator",
"backup": "Azure Monitor alerts",
"fallback": "Custom Logic Apps"
}
# Example: Critical temperature monitoring
critical_alert_pattern = """
1. Data Activator monitors Eventstream
2. Azure Monitor queries KQL database every 5 min
3. If Data Activator fails, Azure Monitor catches issues
4. Both send to same Teams channel with source tag
"""
3. Monitor the Monitor
# Track Data Activator health
def check_data_activator_health(workspace_id: str) -> dict:
    """Check health of Data Activator triggers.

    Assumes helpers get_all_triggers() and get_trigger_status() that wrap
    the relevant REST calls.
    """
    triggers = get_all_triggers(workspace_id)
    health_report = {
        "total_triggers": len(triggers),
        "active_triggers": 0,
        "failed_triggers": 0,
        "issues": [],
    }
    for trigger in triggers:
        status = get_trigger_status(trigger["id"])
        if status["state"] == "Active":
            health_report["active_triggers"] += 1
        elif status["state"] == "Failed":
            health_report["failed_triggers"] += 1
            health_report["issues"].append({
                "trigger": trigger["name"],
                "error": status["error"],
            })
    return health_report
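A natural way to use this is on a schedule, escalating whenever any trigger has failed, for example by reusing the send_tagged_alert sketch from the previous section:
report = check_data_activator_health(workspace_id)  # workspace_id as defined earlier
if report["failed_triggers"] > 0:
    details = "; ".join(
        f"{issue['trigger']}: {issue['error']}" for issue in report["issues"]
    )
    send_tagged_alert("HealthCheck", f"{report['failed_triggers']} trigger(s) failed: {details}")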
Roadmap Expectations
Based on Microsoft announcements and community signals:
| Feature | Expected Timeline |
|---|---|
| Webhook actions | Q1 2024 |
| Azure Functions integration | Q2 2024 |
| Trigger templates | Q2 2024 |
| Cross-workspace sources | Q2 2024 |
| GA release | H2 2024 |
Conclusion
The Data Activator preview offers powerful event-driven capabilities, but production adoption requires careful planning:
- Use for non-critical scenarios during preview
- Build redundancy for important alerts
- Document everything for GA migration
- Provide feedback to shape the product
- Plan for scale even if starting small
The preview is an excellent time to learn and experiment. Build your expertise now to be ready when GA arrives.