Meta collects real workflow signals to train next-generation agents
Meta has begun installing a tool called the Model Capability Initiative (MCI) on US-based employee computers to capture how people actually interact with workplace apps and websites. According to reporting, MCI logs mouse movements, clicks, keystrokes, and occasional screenshots within work-related contexts. This kind of granular, real-world interaction data is precisely what helps AI agents learn to operate software the way humans do.
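The reported signal types (mouse movements, clicks, keystrokes, screenshots) suggest a per-event log format. The sketch below is purely illustrative and assumes nothing about Meta's actual schema; the field names and `InteractionEvent` type are invented for the example.

```python
from dataclasses import dataclass, asdict
import json
import time

# Hypothetical sketch only: fields and names are illustrative,
# not Meta's actual MCI data format.
@dataclass
class InteractionEvent:
    """One UI interaction captured during a work session."""
    timestamp: float   # seconds since the epoch
    app: str           # application or site in focus
    event_type: str    # e.g. "click", "keypress", "scroll"
    target: str        # UI element the event acted on
    x: int = 0         # pointer coordinates, where relevant
    y: int = 0

def serialize(events):
    """Serialize a session's events to JSON lines for downstream training."""
    return "\n".join(json.dumps(asdict(e)) for e in events)

session = [
    InteractionEvent(time.time(), "spreadsheet", "click", "cell:A1", 120, 240),
    InteractionEvent(time.time(), "spreadsheet", "keypress", "cell:A1"),
]
print(serialize(session))
```

Event streams like this are easy to batch, anonymize, or filter before training, which is also where data-minimization safeguards would plug in.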
Training agents on real interactions improves usefulness
By training models on genuine behavioral signals, Meta can build agents that are better at completing multi-step workflows, predicting next actions, and automating repetitive tasks. In practical terms, that can translate into faster task completion, fewer manual errors, and more reliable assistant behavior in enterprise and consumer tools. Meta has stated the data won’t be used for performance evaluations, framing the initiative as capability-focused rather than surveillance-driven.
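To make "predicting next actions" concrete, here is a toy first-order Markov model over action labels. It is a deliberately simplified stand-in for the far richer sequence models an agent would actually use; the function names and the sample workflow logs are invented for illustration.

```python
from collections import Counter, defaultdict

def train_next_action(sessions):
    """Count action -> next-action transitions across recorded sessions."""
    transitions = defaultdict(Counter)
    for actions in sessions:
        for current, nxt in zip(actions, actions[1:]):
            transitions[current][nxt] += 1
    return transitions

def predict_next(transitions, action):
    """Return the most frequently observed follow-up action, if any."""
    if action not in transitions:
        return None
    return transitions[action].most_common(1)[0][0]

# Toy "workflow logs": the same open/edit/save pattern recurs across sessions.
logs = [
    ["open_doc", "edit", "save", "close"],
    ["open_doc", "edit", "edit", "save"],
    ["open_doc", "search", "edit", "save"],
]
model = train_next_action(logs)
print(predict_next(model, "edit"))  # prints "save"
```

Even this trivial model shows why real logs matter: the transition counts, and hence the predictions, come entirely from observed behavior rather than hand-written rules.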
Benefits and governance
The immediate win is tangible: real interaction logs narrow the gap between lab-trained models and real-world workflows, enabling more capable, context-aware automation, and the deployment could accelerate useful agent features across Meta's product suite. At the same time, the initiative underscores the importance of clear transparency, opt-in consent mechanisms, and data minimization to maintain employee trust as agents become more capable.
- Practical agent improvements: smoother automation of routine tasks and improved UI navigation.
- Faster iteration: real data enables quicker model refinement and better user experiences.
- Governance matters: transparency and safeguards will be key to broad acceptance and ethical deployment.