
Microsoft Copilot Updates: What's New for Enterprise Users

Microsoft Copilot continues to evolve rapidly across the Microsoft ecosystem. Let’s explore the latest updates and how they impact enterprise users.

Copilot Ecosystem Overview

┌─────────────────────────────────────────────────────────────┐
│                   Microsoft Copilot                          │
├─────────────────┬─────────────────┬─────────────────────────┤
│ Copilot for M365│ Copilot Studio  │ GitHub Copilot          │
├─────────────────┼─────────────────┼─────────────────────────┤
│ - Word          │ - Custom agents │ - Code completion       │
│ - Excel         │ - Connectors    │ - Chat                  │
│ - PowerPoint    │ - Actions       │ - PR reviews            │
│ - Outlook       │ - Workflows     │ - Documentation         │
│ - Teams         │                 │                         │
└─────────────────┴─────────────────┴─────────────────────────┘

Copilot for Microsoft 365 Updates

Excel Copilot Enhancements

# Excel Copilot can now handle complex data analysis
# Example: Natural language to Python in Excel

# User prompt: "Analyze sales trends and forecast next quarter"
# Copilot generates:

import pandas as pd
import numpy as np
from sklearn.linear_model import LinearRegression
from datetime import timedelta

# Read data from Excel range
df = xl("Sales[#All]", headers=True)

# Prepare time series data
df['Date'] = pd.to_datetime(df['Date'])
df['DayNumber'] = (df['Date'] - df['Date'].min()).dt.days

# Train simple forecast model
X = df[['DayNumber']].values
y = df['Revenue'].values

model = LinearRegression()
model.fit(X, y)

# Forecast next 90 days
last_day = df['DayNumber'].max()
future_days = np.array([[last_day + i] for i in range(1, 91)])
forecast = model.predict(future_days)

# Create forecast dataframe
forecast_dates = pd.date_range(
    start=df['Date'].max() + timedelta(days=1),
    periods=90
)

result = pd.DataFrame({
    'Date': forecast_dates,
    'Forecast': forecast
})

# Return to Excel
result

PowerPoint Copilot for Data Stories

# New capability: Create presentations from data

User: "Create a presentation about our Q3 sales performance
       using data from the Sales lakehouse"

Copilot:
1. Connects to Microsoft Fabric
2. Queries sales data
3. Generates insights
4. Creates slides with:
   - Executive summary
   - Key metrics visualization
   - Regional breakdown
   - Trend analysis
   - Recommendations
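A minimal, self-contained sketch of that flow, with the Fabric query stubbed out as sample data (the helper names and figures here are hypothetical, not Copilot APIs):

```python
# Sketch of the data-story pipeline: query data, derive insights,
# then emit a slide outline. The Fabric query is stubbed with sample
# rows; in practice the data would come from the Sales lakehouse.

def query_sales_data() -> list[dict]:
    # Stub for "connect to Microsoft Fabric and query sales data"
    return [
        {"region": "APAC", "revenue": 1_200_000},
        {"region": "EMEA", "revenue": 950_000},
        {"region": "Americas", "revenue": 1_500_000},
    ]

def build_slide_outline(rows: list[dict]) -> list[dict]:
    total = sum(r["revenue"] for r in rows)
    top = max(rows, key=lambda r: r["revenue"])
    return [
        {"title": "Executive Summary",
         "body": f"Q3 revenue totalled ${total:,}"},
        {"title": "Key Metrics",
         "body": f"Top region: {top['region']} (${top['revenue']:,})"},
        {"title": "Regional Breakdown",
         "body": ", ".join(f"{r['region']}: ${r['revenue']:,}" for r in rows)},
    ]

outline = build_slide_outline(query_sales_data())
```

Copilot performs the equivalent steps behind the scenes; the value of the feature is that the query, aggregation, and slide layout are all generated from the one prompt.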

Teams Copilot Intelligent Recap

# Teams meeting intelligence API

from azure.identity import DefaultAzureCredential
from msgraph import GraphServiceClient

async def get_meeting_insights(meeting_id: str):
    """Get AI-generated meeting insights."""

    credential = DefaultAzureCredential()
    client = GraphServiceClient(credentials=credential)

    # Get transcript and AI insights (illustrative paths: check the
    # Microsoft Graph reference for the exact resources in your tenant)
    transcript = await client.communications.calls[meeting_id].transcript.get()
    insights = await client.communications.calls[meeting_id].insights.get()

    return {
        "summary": insights.summary,
        "action_items": insights.action_items,
        "decisions": insights.decisions,
        "topics": insights.topics,
        "sentiment": insights.sentiment_analysis,
        "follow_ups": insights.suggested_follow_ups
    }

# Example output
{
    "summary": "Discussed Q4 planning and budget allocation...",
    "action_items": [
        {"owner": "John", "task": "Prepare budget proposal", "due": "Nov 20"},
        {"owner": "Sarah", "task": "Review vendor contracts", "due": "Nov 18"}
    ],
    "decisions": [
        "Approved 15% increase for cloud infrastructure",
        "Delayed new hire until Q1"
    ],
    "topics": ["Budget", "Hiring", "Infrastructure"],
    "sentiment": {"overall": "positive", "concerns": ["timeline"]}
}
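Structured output like this is straightforward to post-process. A small sketch that turns the recap's action items into a follow-up message (the `recap` dict mirrors the example output above):

```python
def format_follow_up(insights: dict) -> str:
    """Render recap action items as a follow-up message."""
    lines = ["Action items from the meeting:"]
    for item in insights.get("action_items", []):
        lines.append(f"- {item['owner']}: {item['task']} (due {item['due']})")
    return "\n".join(lines)

recap = {
    "action_items": [
        {"owner": "John", "task": "Prepare budget proposal", "due": "Nov 20"},
        {"owner": "Sarah", "task": "Review vendor contracts", "due": "Nov 18"},
    ]
}
message = format_follow_up(recap)
```

The same pattern works for pushing decisions into a tracker or posting the summary back to the Teams channel.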

Copilot Studio Updates

Custom Copilot Creation

# copilot-config.yaml
name: FabricDataAssistant
description: Help users with Microsoft Fabric data tasks

model:
  provider: azure_openai
  deployment: gpt-4o

knowledge_sources:
  - type: sharepoint
    site: "https://contoso.sharepoint.com/sites/DataTeam"
    folders: ["Documentation", "Best Practices"]

  - type: fabric_lakehouse
    workspace: "DataPlatform"
    lakehouse: "Analytics"

  - type: website
    url: "https://learn.microsoft.com/fabric"

actions:
  - name: query_fabric
    description: Execute queries against Fabric
    connector: fabric_connector
    requires_approval: false

  - name: create_report
    description: Create a Power BI report
    connector: powerbi_connector
    requires_approval: true

conversation:
  greeting: "Hi! I'm your Fabric Data Assistant. I can help you query data, create reports, and answer questions about our data platform."
  fallback: "I'm not sure about that. Would you like me to search our documentation or connect you with the data team?"

security:
  authentication: azure_ad
  allowed_groups: ["Data Users", "Analysts"]
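Before importing a config like this, a quick structural check can catch missing fields. A sketch validating the fields used in the example above (these mirror the sample config, not an official Copilot Studio schema):

```python
# Sanity-check a Copilot Studio config dict before import. The required
# fields follow the copilot-config.yaml example, not an official schema.

def validate_config(config: dict) -> list[str]:
    errors = []
    for field in ("name", "description", "model", "actions"):
        if field not in config:
            errors.append(f"missing field: {field}")
    for action in config.get("actions", []):
        # Every action should state whether it needs human approval
        if "requires_approval" not in action:
            errors.append(f"action '{action.get('name')}' missing requires_approval")
    return errors

config = {
    "name": "FabricDataAssistant",
    "description": "Help users with Microsoft Fabric data tasks",
    "model": {"provider": "azure_openai", "deployment": "gpt-4o"},
    "actions": [
        {"name": "query_fabric", "requires_approval": False},
        {"name": "create_report", "requires_approval": True},
    ],
}
```

An empty error list means the config is structurally complete; anything returned should be fixed before the copilot is published.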

Copilot Actions

# Copilot Studio actions are defined using Power Platform connectors and flows
# For Fabric integration, use the Fabric REST API via custom connectors

# Example: Power Automate flow action for Fabric queries
# This would be configured in Power Automate and called from Copilot Studio

from azure.identity import DefaultAzureCredential

def fabric_query_action(query: str, lakehouse: str, max_rows: int = 100):
    """Execute queries against Microsoft Fabric Lakehouse via REST API."""

    credential = DefaultAzureCredential()
    token = credential.get_token("https://api.fabric.microsoft.com/.default").token

    # Auth header for the Fabric REST calls this wrapper would make
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json"
    }

    # Placeholder IDs: substitute your workspace and lakehouse GUIDs
    workspace_id = "your-workspace-id"
    lakehouse_id = "your-lakehouse-id"

    # Queries run against the lakehouse's SQL analytics endpoint
    sql_endpoint = "https://<workspace>.datawarehouse.fabric.microsoft.com"

    # Cap the number of rows the copilot action can return
    safe_query = f"SELECT TOP {max_rows} * FROM ({query}) AS subquery"

    # Note: For production, connect to the SQL endpoint with pyodbc
    # or another SQL driver, using the Fabric SQL connection string

    return {
        "endpoint": sql_endpoint,
        "query": safe_query,
        "lakehouse": lakehouse,
        "max_rows": max_rows
    }

# In Copilot Studio:
# 1. Create a custom connector pointing to your Fabric API wrapper
# 2. Define the action with input parameters (query, lakehouse, max_rows)
# 3. Configure authentication using Azure AD
# 4. Use the action in your copilot topics

Copilot Connectors

# Custom connector definition for Copilot Studio (OpenAPI format)
# Save as fabric-connector.yaml and import into Power Platform

swagger: "2.0"
info:
  title: Microsoft Fabric Connector
  description: Connect to Microsoft Fabric workspaces
  version: "1.0"
host: api.fabric.microsoft.com
basePath: /v1
schemes:
  - https
securityDefinitions:
  oauth2_auth:
    type: oauth2
    flow: accessCode
    authorizationUrl: https://login.microsoftonline.com/common/oauth2/authorize
    tokenUrl: https://login.microsoftonline.com/common/oauth2/token
    scopes:
      https://api.fabric.microsoft.com/.default: Access Fabric API
paths:
  /workspaces:
    get:
      summary: List workspaces
      operationId: ListWorkspaces
      responses:
        200:
          description: Success
          schema:
            type: object
            properties:
              value:
                type: array
                items:
                  type: object
  /workspaces/{workspaceId}/items:
    get:
      summary: List items in workspace
      operationId: ListItems
      parameters:
        - name: workspaceId
          in: path
          required: true
          type: string
      responses:
        200:
          description: Success

# In Copilot Studio:
# 1. Go to Connectors > Custom connectors
# 2. Import this OpenAPI definition
# 3. Configure OAuth2 authentication with Azure AD
# 4. Test the connector
# 5. Use in your copilot actions
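The connector's ListWorkspaces operation maps to a plain GET /v1/workspaces call against the Fabric REST API. A standard-library sketch, with the Azure AD token passed in as a parameter (acquiring it, e.g. via azure.identity, is left out):

```python
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def list_workspaces(token: str) -> dict:
    """Call the same endpoint the ListWorkspaces operation wraps."""
    req = urllib.request.Request(
        f"{FABRIC_API}/workspaces",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def workspace_names(response_body: dict) -> list[str]:
    # Fabric collection responses wrap items in a "value" array,
    # matching the schema declared in the connector definition
    return [item.get("displayName", "") for item in response_body.get("value", [])]
```

Testing the raw call first makes it easier to debug the custom connector, since both hit the identical endpoint and response shape.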

Integration with Fabric

# Fabric Copilot integration example

class FabricCopilotIntegration:
    """Integrate Copilot with Microsoft Fabric."""

    async def natural_language_query(self, question: str) -> dict:
        """Convert natural language to Fabric query."""

        # Get schema context
        schema = await self.get_lakehouse_schema()

        # Generate SQL using Copilot
        prompt = f"""Based on this schema:
{schema}

Convert this question to SQL:
{question}

Return only the SQL query."""

        response = await self.copilot.generate(prompt)
        sql = response.content

        # Execute and return results
        results = await self.execute_query(sql)

        return {
            "question": question,
            "sql": sql,
            "results": results,
            "explanation": await self.explain_results(question, results)
        }

    async def create_report_from_description(self, description: str) -> str:
        """Create Power BI report from natural language description."""

        # Parse description
        report_spec = await self.copilot.parse_report_description(description)

        # Generate DAX measures
        measures = await self.generate_dax_measures(report_spec)

        # Create report
        report = await self.powerbi.create_report(
            name=report_spec.name,
            dataset=report_spec.dataset,
            pages=report_spec.pages,
            measures=measures
        )

        return report.url
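Because natural_language_query executes model-generated SQL, it is worth guarding the statement before it reaches the lakehouse. A hypothetical read-only check (not part of any Fabric SDK) that rejects anything other than a single SELECT:

```python
def is_safe_select(sql: str) -> bool:
    """Allow only a single read-only SELECT statement."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:          # multiple statements chained together
        return False
    lowered = stripped.lower()
    if not lowered.startswith("select"):
        return False
    # Reject embedded write/DDL keywords
    banned = ("insert", "update", "delete", "drop", "alter", "merge", "truncate")
    return not any(f" {kw} " in f" {lowered} " for kw in banned)
```

A keyword check like this is deliberately coarse; for real deployments, running the copilot's queries under a read-only database role is the stronger control.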

Microsoft Copilot is becoming the unified AI interface across the Microsoft ecosystem. Understanding these capabilities helps you leverage AI assistants effectively in your enterprise workflows.

Michael John Peña

Senior Data Engineer based in Sydney. Writing about data, cloud, and technology.