Microsoft Vibing — capturing screenshots and voice samples without governance


Kevin Beaumont

An interesting executable caught my eye on endpoints recently — Vibing.exe

It was delivered by Microsoft Store, and claims to be “your interface to the AI-native world”:

Published by “Vibing-Team”, the executable exhibits behaviour for capturing the user’s screen contents, the clipboard, and the user’s microphone, and for sending traffic to an Azure interface. The terms of sale link to Microsoft.

Vibing-Team themselves have no website, and there’s no clue online as to who Vibing-Team are.

The software does not tell the user it is sending data to Azure — there’s no consent or prompt in app.

Demos posted on the Vibing website show the software writing Microsoft Word documents, working in Microsoft Teams, and coding in Visual Studio:

Digging into Microsoft’s store, there is a privacy policy for Vibing, which isn’t referenced in-app. This policy incorrectly states that the user configures the API servers, but these are actually preconfigured to a Microsoft Azure endpoint — which is indeed a third-party service:

The pre-configuration:


So, what is Vibing, and who owns it?

The software itself can be downloaded from the Vibing website or Microsoft Store.

The software itself is in Vibing.exe, which is digitally signed by Yaoyao Chang using an SSL.com co-signer:


Yaoyao works for the Microsoft GenAI research labs in Beijing.

The entry describes Vibing as “built by the community”, and was added by MSJwyv, another Microsoft employee at the AI research centre in Beijing. The change describes the adoption as “open-source”:

However, the Vibing Github repo is not open source — in fact, it contains no source code at all, just an 80MB binary, Vibing.exe.

Strangely, Yaoyao comments on different Github issues and appears to suggest they are not involved, e.g.:

…despite the fact they have digitally signed the binary and published the app.

The Vibing Github repo uses the same logo design as the Microsoft VibeVoice product, and has just three contributors:

These accounts were created just before Microsoft added the repo to their VibeVoice page.

The installation instructions appear to feature screenshots from a Microsoft corporate device, and using OSINT tools I’ve been able to determine that the Azure Front Door endpoint it sends user data to sits in a Microsoft-owned corporate Azure tenant.

By “community”, Microsoft’s Research Asia mean Microsoft — they’ve just skipped their own governance and compliance steps around getting security, privacy and AI reviews by pretending it’s an open source project.

What does Vibing do?

Vibing is available for Windows and Mac. I’ve only looked at the Windows version to date.

Vibing sets itself to auto-start on Windows login. When in use, it hijacks the clipboard to copy content and takes screenshots of the user’s PC — these screens are then base64-encoded and sent to the Azure Front Door endpoint along with the unique, per-machine hardware GUID. This allows screenshots to be identified per machine. It is unclear why Microsoft would want to link screenshots to systems.
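To make the tracking problem concrete, here’s a minimal sketch of the upload shape described above — the field names and payload structure are my assumptions for illustration, not the actual wire protocol:

```python
import base64
import json

# Hypothetical payload builder illustrating the observed pattern:
# a base64-encoded screenshot tied to a stable per-machine GUID.
def build_screenshot_payload(screenshot_bytes: bytes, machine_guid: str) -> str:
    payload = {
        "machine_id": machine_guid,  # stable hardware GUID, never changes
        "screenshot": base64.b64encode(screenshot_bytes).decode("ascii"),
    }
    return json.dumps(payload)

# Because the GUID is constant, every upload from one PC is linkable over time.
first = json.loads(build_screenshot_payload(b"fake-png-bytes", "machine-guid-1"))
later = json.loads(build_screenshot_payload(b"other-png-bytes", "machine-guid-1"))
assert first["machine_id"] == later["machine_id"]  # same machine, trackable
```

The point of the sketch is the identifier, not the encoding: base64 is just transport packaging, but shipping a constant hardware GUID with every capture is what lets the operator correlate all screenshots from a single machine.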

Along with the screenshots, it sends certain words from windows, window titles/app names, and content over WebSocket; WebSocket can avoid some proxy blocking configurations. It also matches hotwords locally, then submits the matched words in the WebSocket requests.
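The hotword behaviour can be sketched roughly like this — the word list and matching logic here are illustrative assumptions, showing local matching followed by submission of only the hits over the existing channel:

```python
# Hypothetical hotword matcher: scan window titles/content locally,
# then queue only the matched words for the WebSocket channel.
HOTWORDS = {"password", "invoice", "confidential"}  # illustrative list

def match_hotwords(text: str) -> list:
    # Normalise words (strip trailing punctuation, lowercase) then intersect.
    words = {w.strip(".,:").lower() for w in text.split()}
    return sorted(words & HOTWORDS)

# Only matches leave the machine, so the full word list never appears
# in network traffic -- only evidence that a hotword was on screen.
queue = match_hotwords("Q3 Invoice - CONFIDENTIAL draft.docx")
assert queue == ["confidential", "invoice"]
```

Matching locally and exfiltrating only hits is notable from a detection standpoint: network monitoring sees short, innocuous-looking word submissions rather than bulk content.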


It also records audio using the microphone, and uploads this raw audio to Azure — also with GUID identifiers.

Security concerns

I’ve identified a number of cybersecurity concerns which need further investigation and possible responsible disclosure. The software is large and has a rich attack surface.


The software itself appears to be vibe coded, and is held together by string. You can grab it yourself for research here.

Privacy concerns

  • There’s no indication in the software it is sending data remotely
  • Vibing’s privacy policy on Microsoft Store says it doesn’t send data to third parties but it absolutely does out of the box
  • There’s no named author online anywhere for Vibing-Team or any data controller listed, with the owners within Microsoft attempting to disguise themselves
  • It sends a unique GUID for every keystroke and screenshot which allows data to be tracked over time, and this isn’t disclosed anywhere. There’s no reason at all for this to be collected.
  • There’s no way this has been through a proper privacy review at Microsoft prior to release (or security review)
  • It is unclear what privacy oversight this has at Microsoft in terms of operations — e.g. who is looking at the gathered data? Is it really being securely removed? Is the Microsoft AI team in China supposed to be doing this on their own?

What is Microsoft doing about this?

Multiple concerns have been raised by developers with Microsoft about Vibing, including tagging those at Microsoft who linked and published the project.

For example, here concerns are raised — which MSFT’s Yaoyao Chang closes without comment and takes no action:

MSFT’s MSJwyv is also tagged with concerns… who doesn’t comment or take action.

You may know Microsoft VibeVoice from their prior incident:

This has been going on for weeks now.

IoCs

  • vibing.exe
  • Vibing Installer.exe

vibing-api-ccegdhbrg2d6bsd7.b02.azurefd.net
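For defenders, a quick sweep of proxy or EDR logs for these IoCs can be sketched as follows — this assumes plain-text log lines; real field layouts will vary by product:

```python
# Sweep plain-text log lines for the Vibing IoCs listed above.
IOCS = (
    "vibing.exe",
    "vibing installer.exe",
    "vibing-api-ccegdhbrg2d6bsd7.b02.azurefd.net",
)

def find_ioc_hits(lines: list) -> list:
    """Return (line_number, ioc) pairs for case-insensitive substring hits."""
    hits = []
    for n, line in enumerate(lines, 1):
        lowered = line.lower()
        for ioc in IOCS:
            if ioc in lowered:
                hits.append((n, ioc))
    return hits

sample_logs = [
    "GET https://vibing-api-ccegdhbrg2d6bsd7.b02.azurefd.net/ws 101",
    "process started: C:\\Users\\a\\Downloads\\Vibing Installer.exe",
    "GET https://example.com/ 200",
]
assert find_ioc_hits(sample_logs) == [
    (1, "vibing-api-ccegdhbrg2d6bsd7.b02.azurefd.net"),
    (2, "vibing installer.exe"),
]
```

Blocking the Azure Front Door hostname at the proxy, and alerting on the two executable names, covers the behaviour described in this post; the WebSocket traffic rides over the same endpoint.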

Updates

Friday 24th April 2026 – 1am –

1pm – Microsoft have removed the Vibing downloads and shut down the service, pending a compliance review:

Saturday 25th April — 11am

Microsoft are attempting to hide the compliance review of Microsoft Vibing:

Monday 27th April – 1pm

Somebody is attempting to hide the commit with Yaoyao’s name on it, which further obscures the compliance review of Microsoft Vibing:

Wednesday 29th April – 5pm

Journalist Dan Goodin asked Microsoft about this. They’ve confirmed it is a Microsoft research project. They say “We have removed the application as we review its functionality and adherence to our policies. We remain committed to responsible AI and are taking appropriate steps as part of this review.”

It is unclear what happened to the screenshots and audio collected, why Microsoft hadn’t disclosed ownership of the service, and why, prior to this blog, staff were pretending it was an open-source project that they were not involved with.