Facebook and Microsoft Partner on Remote Development
developers.facebook.com

Just an FYI for people - the Remote Development extensions are not open source. I'd hope if Facebook were joining efforts, they'd do so on a more open project.
1: https://code.visualstudio.com/docs/remote/faq#_why-arent-the...
2: https://github.com/microsoft/vscode/wiki/Differences-between...
3: https://github.com/VSCodium/vscodium/issues/240 (aka, on-the-wire DRM to make sure the remote components only talk to a licensed VS Code build from Microsoft)
MS edited the licensing terms many moons ago, to prepare for VSO (Visual Studio Online = VS Code in the browser, using these remote extensions/APIs that no one else can use) - https://github.com/microsoft/vscode/issues/48279
Finally, this is the thread where you will see regular users being negatively impacted by the DRM (a closed source, non-statically linked proprietary binary downloaded at runtime) that implements this proprietary-ness: https://github.com/microsoft/vscode-remote-release/issues/10... (of course, also with enough details to potentially patch around this issue if you were so inclined). Further, MS acknowledged that statically linking would help in May, and yet it appears to still be an issue.
I just hope they don't come after Eclipse Theia...
Fuck. I really bought into MS's open VSCode spiel.
It's the Open Core business model that is becoming the standard Open Source business model these days. Wikipedia even has a nice little list of companies that use the same business model: https://en.wikipedia.org/wiki/Open-core_model
It's not EEE. The open core makes it so that the last step in EEE doesn't work. With an open core, anyone can fork.
Very happy about this move. Nuclide’s remote development capabilities were way above anything else I’ve tried (Sublime, IntelliJ Ultimate, VSCode, remote SSH mounts, etc).
It’s the only solution I’ve found that really allows you to browse the remote filesystem as smoothly as you would with your local drive (including when you’re also changing the remote files outside the IDE), degrade functionality as needed when the connection isn’t great (using caching appropriately), and immediately recover when it comes back. The only cost to pay was a bit of setup server-side (installing watchman and opening a port, if I remember correctly). I really hope they can bring the VSCode experience to the same level!
What is missing from vscode for remote development? MS released their remote development extensions earlier this year, and everything from browsing to searching feels native.
I haven't personally tried it yet, but the positive reviews made me curious - I'll have to check it out!
For reference, here's the documentation page for VS Code Remote Development:
https://code.visualstudio.com/docs/remote/remote-overview
The Remote Development Extension Pack (linked in the article):
https://marketplace.visualstudio.com/items?itemName=ms-vscod...
Do you happen to know if it works for Vagrant VMs? Having a filesystem mismatch (dependencies symlinked in the VM that the IDE doesn't recognise) is a bit of a pain point for me right now, so it would be really fantastic to find a solution within VSC.
Not sure, but it works by installing a server component on the remote machine (under ~/.vscode-server, if I remember correctly); your code files stay where they are on the remote filesystem, and when you connect, VS Code loads them over SSH.
I've been using a Vagrant VM and VSC remote via SSH for the last two months, and it works really, really well.
If you can SSH to your vagrant VM, then yes!
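If it helps, one way to wire this up (host alias here is arbitrary): running `vagrant ssh-config` in the directory with your Vagrantfile prints an SSH config block you can append to `~/.ssh/config`. It looks roughly like this (ports and paths vary per VM), and once it's there the VM shows up as an ordinary host in Remote-SSH's "Connect to Host..." list:

```
Host my-vagrant-vm
  HostName 127.0.0.1
  User vagrant
  Port 2222
  IdentityFile /path/to/.vagrant/machines/default/virtualbox/private_key
```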
Looks pretty promising actually :)
Yeah, same reaction. I tried VSC's remote dev stuff for pair-programming with a friend in another state, and was surprised and delighted at how well it worked. It was effortless and frictionless.
I've had a similar experience. Nothing but good things to say about it, the extension works incredibly well.
Really? I'll give it a try. When vscode was first released this was the one feature missing.
I tried it on the first day it was released and haven't looked back. It's made me significantly more productive than with the old ssh plug-ins.
+1 for VSCode.... I'm still waiting for the catch.
Indeed, I remember trying to switch to IntelliJ locally (with SSHFS to the server) a couple of times while working there, and always going back to Nuclide because of the lag with SSHFS. Nuclide had a lot of problems, but good support for remote dev wasn't one of them.
From what I've used internally, the remote development bits are already better than they were under Nuclide. Adding and managing multiple remote repos/working copies is a breeze.
Dired/Emacs has already had everything you're talking about for a long long time. There is even support for remote "inferior shells" in Emacs - notably for python.
I was an Atom holdout for a while - it felt disingenuous the way Microsoft kind of swept in and usurped their idea.
But man, eventually I caved. VSCode is a marvel. It performs better than most native IDEs I've used, despite being Electron-based. It's the good parts of Visual Studio without any of the legacy baggage. Its package ecosystem is just as vibrant as Atom's, with solid support for nearly every language under the sun, but at the same time its built-in features and attention to detail in the user experience are Apple-level (really, Apple-of-ten-years-ago-level). I still can't believe I get to use it for free.
I would say Microsoft's dev tools have always been best in the business, but only in their ecosystem and only for their ecosystem. A big part of the reason I still go out of my way to use MS SQL is how good SQL Server Management Studio (SSMS) is.
The revolution with VSCode is the openness, not the quality.
Out of interest, have you ever used Jetbrains products like IntelliJ IDEA and friends? Even back when I was writing C#, years ago, Visual Studio was way less productive without their ReSharper plugin.
I'm sure it's a "to each their own" feeling.
Personally, I've never seen Visual Studio as unproductive as when ReSharper is installed. Over the years I've had to convince too many colleagues to remove or disable ReSharper just to get any work done on projects. ("I hate Visual Studio because it is too slow." "Do you have ReSharper installed? What happens when you disable it?" "Wow, Visual Studio is really fast now." Surprise.) I've had employers install it by default, I didn't like using it, and I made sure that my employer wasn't paying for a license directly for me when I uninstalled it.
If Android Studio is any indication, I don't see what the fuss is about IntelliJ either. But some of that is certainly just lack of familiarity because I only open up Android Studio when I have to.
It may just be that my experience was a good while back and VS has improved since then. At the time the built-in static analysis really wasn't up to scratch.
I’ve been happy doing most tasks in MySQL Workbench that I can do with SSMS.
Oddly, when using SQL Azure instead of on-prem SQL Server, SSMS doesn’t let you use its friendly hand-holding dialogs but instead drops you into a new editor document with cryptic syntax.
The only excuse I can think of is that the user-friendly popups are single-threaded, block window messages on network IO, and would have a very poor UX due to the chatty TDS protocol.
It does sound like SQL Server and SSMS have some very interesting choices in remoting models, based on what I've read of Azure Data Studio's development.
Azure Data Studio is the VS Code-based, SSMS-like tool focused on Azure SQL and Cosmos DB. At one point the development blog talked as if Azure Data Studio might someday replace even SSMS itself, but they seem to have walked that back, at least in part because of how many crazy things SSMS does, and because SSMS is one of those tools where certain types of users would possibly go into some sort of costly meltdown if their cheese moved.
> usurped their idea.
Monaco was a thing way before Atom was. It was publicly released after Atom but it was already in some of Microsoft's websites.
That's interesting to know; perhaps it was just the right time for independent discovery.
Still, a web-based code editor on its own has independent utility from an IDE. I'm not sure they would've taken the plunge into a full-on Electron IDE without Atom first showing that there was demand for one.
Monaco was already embedded in the IE 10+ Dev Tools before Atom, and Browser Dev Tools are IDEs in almost every respect other than being bundled independently from a Browser, and even that has changed in some recent respects (you can run Chromium's Dev Tools as an independent app).
(In an interesting full circle, the Elements for Edge plugin to VS Code makes VS Code a full DOM browser for Chromium-based Edge.)
I don't think Atom was first with that idea; for example, Brackets was in development before Atom's first release.
Monaco here being the code editor VSCode uses, not the Apple font or microstate.
For me, the tipping point was some Atom bugs that remained open for a long time[1]. When I decided I didn't want to deal with those any longer, I went back to Sublime, which is what I was using before - so from my point of view, Atom itself created its own demise. Of course I'm not denying that the competition from VS Code played a part, but at the same time I think that if the Atom community had addressed some UX issues a bit more promptly, Atom would have had a better chance.
[1] in fact I've just checked and at least one of those bugs - #10720 - is still open, almost 4 years after it was created.
For me the tipping point was when installing Atom's Flow extension just started tanking performance. Atom had this super open write-whatever-in-JavaScript extension model which led to tons of great options, but it also meant that lots of common performance-intensive things didn't have a proper, coordinated, prescribed channel and were always talking over each other.
Kind of like how Android apps can just do whatever they want, and how that's become a major performance challenge for Google to mitigate. Whereas iOS says "here's how you send push notifications, here's how you do X Y and Z in the background, here's how you do web views, work within these APIs". Those constraints allow the platform to schedule and prioritize things, reuse work, and enforce quality. I don't actually know firsthand that VSCode enforces this kind of model, but I don't see how else they could get the performance they do with arbitrary extensions written in JavaScript.
I teach underprivileged adults how to code. Most of them use old computers so I had them ssh into my server when they code so they don't have to install anything. When they learn about web development, whatever they built would be immediately available online under their subdomains that they could show off immediately to their friends and try on their phones.
The only drawback is that they had to learn and use vim to do their code editing. I tried to make their lives easier by setting every user with a default vim config, but the insert vs normal mode is a big hurdle for most students.
I'm glad visual studio code supports remote development, I'm hoping that means now my students can use it to ssh access my server and code there. Excited to try this out with new students.
You know, there are modeless editors that are available through the terminal. There's Emacs, of course, which I recommend, but for beginners an editor like joe or nano, suitably configured, might do quite well.
Actually using vi/vim is a very useful skill to have. It is always available on a Unix system and works even across slow links.
Perhaps you can let them edit their files locally and instead of vim, teach them how to scp/rsync the files onto your server. Hopefully their computers are at least fast enough to run VS-Code.
Notepad++ (for Windows; there are similar things for other platforms) at least has the ability to transparently deal with remote connections; maybe that would be a good option here.
My mid end computer isn't fast enough to run VS code.
A raspberry pi can run VS Code so I'm a bit curious what you call a mid end computer.
mid end by which standards. my notebook is getting pretty low-end by today (i5-4200U) and it just runs fine...
Have you tried [Theia IDE](https://theia-ide.org/)? Remotely hosted VS Code?
If you have your projects on GitHub, you can use Gitpod (www.gitpod.io). It lets you define your dev environment in code and then spawn a remote environment in Google Cloud, accessed via the browser with a VS Code-like editor. Only GitHub authentication is required, and it is free for open source projects up to 100 hours a month.
Disclaimer: I'm working on Gitpod.
Are their machines able to run VSCode?
I've had a really good experience with the VSCode remote development tools (https://code.visualstudio.com/docs/remote/remote-overview). Both filesystem access and sharing local web servers work out of the box.
Happy to chat and see if we can figure something out.
Glad to hear you are doing that. If I may ask, does the program/course have any online presence?
I'm trying to document everything we teach here: c0d3.com/book
Separate question: Is it possible to configure your terminal to forward to your local vim installation through SSH?
I don’t believe people who can’t learn vim can actually code something meaningful, but why not use any other editor like nano?
Do you think that figuring out vim is a good use of time for people who are probably time-limited anyway and are trying to build useful skills? Using a vi clone is a lifestyle choice, not an employable skill. Frankly, so is nano or anything else that asks them to abandon the skills they may have developed for navigating their computing environment.
Computers are hard. Websites are complicated.
Learning vim is more of an obstacle than using Notepad++ and FileZilla would be.
As a beginner, you know only enough to pass by. You can do a lot with just passing by, so long as you don't bump into places you can't get yourself out of. "Just enough command line and vi to edit a file" is a bit magic, but plenty of the other stuff involved is going to be as opaque.
I think the skillset of "learn how to write websites in a short time" is going to involve some level of "here's this stuff I have to do in just the right way" anyway.
I think it depends on the situation. In my case, learning Vim was quite beneficial as I had to maintain servers along with backend development. For any frontend development, starting with a modern IDE will be any day faster than Vim.
being able to edit files on a remote server is not an employable skill? most unix servers will have vi or vim as lowest common denominator editor
Learning vim is 1% of the skills at most. If these people can't afford to learn vim, how can they learn, I don't know, React or Django?
(I think Vim is employable: being able to quickly edit a file in any Unix-like environment is very helpful. But that’s off topic.)
Anyway, if nano is not good enough, there are hundreds of other decent editors available. My point is: "VS Code is good because vim is hard" is a very weak argument. VS Code is good, no doubt. But if a teacher forces students to use vim, and students have problems with vim, then the problem is not a lack of VS Code.
I'm disappointed this wasn't really about remote workers but about remote development, as in: the code doesn't live on your machine. That said, remote development tends to be awful, and more tools could be helpful if they're not specific to Facebook's particular implementation of remote dev shards. Learning that at Facebook you don't develop things locally, and are likely on a box with all sorts of things tracking your usage and access to everything, lends even further credence to the big brother thing. I understand that for a codebase and app that's too big to run on one client machine it makes sense, but what also makes sense is having piecemeal development environments where you just pull the components that you need.
Disclaimer: Work at Facebook
Not really sure what you are implying with the 'Big Brother' comment. The remote development servers are only used for writing/debugging code and not for other daily tasks. Even if they are tracking what tools/functions I am using on the server, what does that matter?
Personally, I have found remote development awesome because it enables engineers to start contributing to a huge product in the very first hour. No need to wait for the repository to clone, the dependencies to install, and the code to build before you can become productive.
> Personally, I have found remote development awesome because it enables engineers to start contributing to a huge product in the very first hour. No need to wait for the repository to clone, the dependencies to install, and the code to build before you can become productive.
I am nobody but I've had this nagging feeling ever since I started working on websites for big corporations about why can't I work on my code on my local machine with no network connectivity? Why do I need to talk to three different databases and two different services on five different servers? Why can't I just fake all those things during my development?
If anything, my not so humble opinion is that remote development further enables bad habits. Of course, remote development is a tool and is not to blame, but I recently learned the term "hermetic build" [Google SRE]:
>The build process is self-contained and must not rely on services that are external to the build environment.
Personally, I think we should work towards making it possible to run (or at least stub) the whole stack on a single physical machine - be it local or remote. What do you think? I think this is a trivial problem engineering-wise but I am not very good at selling ideas.
[Google SRE] https://landing.google.com/sre/sre-book/chapters/release-eng...
> Why can't I just fake all those things during my development?
WireMock can do this, but it takes some work. I wish public API providers also provided mock servers for developers.
Outside of web development, sometimes you need special hardware or workspaces configured to do development. Usually this can be done in a container, but that comes with its own annoyances. Having a central development server for large compiled code bases is really useful.
Not true. I worked on a real distributed payment system. Every service came with a mock clone or a single-server mode to stand up all services on the same box. Every developer had a powerful personal desktop (or two). I loved it. Everyone I knew there loved it.
See comment below. Just because it wasn't true for you, doesn't mean it's not true. I'm not talking about mocking services to run it; I'm talking about the development environment and sharing with a team of 20 or so on a code base that takes 45 minutes to compile without parallelizing the build. You also cannot mock GPU functions. There's simply no cuda emulator.
This is an ideal case and I wish it's like this for everybody. At least one of the payment providers I'm working with in a customer's project doesn't even have a sandbox. We must test with real money. Obviously we mock everything in unit and integration tests, but to know that it really works the customer must put money on the manual testing account.
Those are all valid trade-offs, but it's ignoring the issue raised by the OP.
Like, is an environment like this encouraging bad habits?
But anyway..
Special hardware: such as? Very few systems can't be downscaled. In situations where you need this special hardware, we're talking horizontal multi-server setups anyway.
Special workspaces: be less special, improve tooling, improve build operations. Very few setups actually need centralised configuration.
Large compilations: I'm not sold that a) there are many people with a justifiable need, and b) anyone with a real justifiable need is likely going to want on-demand autoscaling of the compilation servers, i.e. development won't be local anyway.
I'm not trying to debunk everything you've said. It's a trade off and I've used central dev databases in the past for legacy systems and it worked well for those in the office.
All I can tell you is that iteration speed, testability of code, manual testing and all around team morale was DRASTICALLY improved by stubbing out that bottleneck in newer systems.
I'm speaking for my individual case and others I work with where we have codebases of over a million lines of c++, with many header-only libraries. On an 80-core server make -j can still take 3-4 minutes, and that uses all the resources on the machine. Trust me, I wish I could have something as fast that's not centralized. The closest I can think of is either a VM (slow/wasteful), or a container. The container would be really easy if everyone used vscode with the container development plugin, but not everyone on the team does.
For those that don't, it's much more friction to remember to start it up, etc.
Absolutely agree! If it's any consolation, there are teams that make this possible in a "big corporation" context.
tangent to your points: you are not "nobody"!
People don't like being tracked and Facebook has a long history of being deceptive around how and when it tracks people. Facebook also has a long history of selling the data it collects as a result of tracking even when the people being tracked try as they might to opt out.
Hence the "big brother" reference.
Frankly, I was/am a little leery of how Facebook might be "improving" the VS Code Remote Development pack, since I use it every day.
As an employee working on a corporate device everything will be tracked anyways, this is the most tin-foil take ever.
Your latter concern is at least a reasonable one, but it should all be open source anyways? Not that any of us have the time to audit everything. I doubt Microsoft is going to allow anything nefarious...
> As an employee working on a corporate device everything will be tracked anyways
Note that that isn't true in most (all?) European countries. It is illegal to eavesdrop on employees without a strong and particular reason.
You're telling me that in the EU things that happen on company assets are not tracked? I think any company could easily come up with a 'strong' reason (IP theft?)
Yes, believe it or not: things that happen on company assets are not tracked.
My employer might track me, but how does that mitigate my concerns that a 3rd party data aggregator like Facebook might track me as well as a result of installing a closed source plugin that they "improved".
I just addressed this in the second part of my comment. You would have to trust Microsoft/Facebook at this point, right? So if you don't and you really actually care (unlike 99.999999 percent of people), don't use closed source software, and audit every single line of the open source software you use, because I bet a lot less eyes are looking at much of what you use.
Lots of companies would block their developers from using a plugin if it was determined to contain any sort of non-trivial telemetry.
> People don't like being tracked and Facebook has a long history of being deceptive around how and when it tracks people.
you are mixing unrelated things. As an employee of a company it's unlikely you can raise any concern about your privacy when working on the company's main asset.
> Facebook also has a long history of selling the data it collects as a result of tracking even when the people being tracked try as they might to opt out.
provide a reference please
Not OP, but I'm also not sure how OP could provide a decent reference given the constantly moving goalposts with respect to the privacy implications of the Facebook platform. The default has always been towards public and noisy, even as Facebook has been forced to mature and realize there were privacy implications in things they were doing by default on the platform.
Despite a culture of "move fast and break things," things have never been broken by new, more restrictive default privacy settings. Users who signed up in 2005 would still be an open book by default today.
My point was to elaborate on how some other user might feel that Facebook represent a big brother type actor.
However, you have suggested that there is or should be no privacy when doing development, but I'll remind you that not all devs work in large corporate environments and even when they do, they might reasonably expect to be watched only by their employer. It is frankly not 100% clear that Facebook will never have access to telemetry as a result of "improving" this set of plugins. They certainly did not suggest as much.
In my opinion, there is a dramatic difference between what my employer may or may not do to track me and what a 3rd party social media company may or may not do to track my development practices as a result of using a plugin.
While semantically I may have overstated that Facebook "sells" data, they unquestionably considered doing so between 2012 and 2014 despite promises made to the contrary. They also unquestionably shared data with other large data aggregators. Even if USD did not change hands I think we can be fairly certain that Facebook bartered in user data.
Here is your reference: https://www.cbsnews.com/news/facebook-gave-some-companies-pr...
Unrelated? How? It's Facebook. They have shown a repeated assault on, and disavowal of, social responsibility. I wouldn't expect ExxonMobil to do very nice things in non-oil contexts. Are you saying you would?
Reference? Where have you been? Cambridge Analytica? Its advertising arms? It's all selling user data, either directly or indirectly. Don't be so naive.
I do honestly expect there to be access control, especially at a big tech company, but when it's Facebook doing it, I can't help but think there's some MBA who might put you in a "bottom 5%" productive employee cohort because your usage patterns just so happen to correlate with less productive employees. They could of course do that almost anywhere, since business IT tracking software is ubiquitous, especially in tech, but it feels more likely at FB.
> when its Facebook doing it I can't help but think there's some MBA that might put you in a "bottom 5%" productive employee cohort because your usage patterns just so happen to correlate with less productive employees.
I work at Facebook. I would say it's less likely there than elsewhere that I have worked due to how the review cycle is set up. People have this strange idea that Facebook is some kind of top down panopticon.
If you’re using a React pipeline on the remote server, how quickly does the page refresh occur? I do remote VM development where I work and a major pain point for me is how long it takes to refresh a page that I’m working on.
You navigate to the remote server in your browser to test your changes. You just need to save and refresh the page and the updated code is magically deployed to the page.
> Even if they are tracking what tools/functions I am using on the server, what does that matter?
It absolutely matters. I do not want to give facebook any data, especially about my development work.
The "what does that matter" attitude is the entire reason people do not trust Facebook.
I think you have some wires crossed.
YOU, a random developer working at some other company, or on your open source projects, are not sending data to facebook.
Facebook employees, using facebook's tools, running on facebook's dev-servers, to build facebook, ARE sending dev-tool telemetry to facebook's dev-tool-development team.
"What does it matter?" is referring to the latter.
> all sorts of things tracking your usage and access to everything lends even further credence to the big brother thing.
As a user, shouldn't you be happy if the usage / access to everything by Facebook developers is carefully monitored? That should significantly reduce the risk that some insider improperly accesses any of your data.
>As a user, shouldn't you be happy if the usage / access to everything by Facebook developers is carefully monitored?
Only in the sense that I would be happy that the serial killer that captured me uses sterilised blades. That would considerably reduce the risk of infection.
I thought the same. After working remotely for 3 years I was happy to learn that FB and Microsoft were starting a remote program. I was wrong...
Agreed. I read it the same way (Yay! Remote work!) and was similarly disappointed.
The trend seems to be the IDE and development environment will move to the cloud for larger companies. This is what Facebook and Google are already doing. It makes ramp up easier, environments more consistent and tooling more predictable. Also you can spin up multiple development branches of the same codebase without having to switch branches or stash changes (a big deal when you have one giant monorepo). And probably harder to leak any IP.
I might be becoming an old fart, but I really don't see the benefits. I use vim, so technically I've been able to do remote development for years, but I've never felt the need to. Git makes sure that I have easy access to the right code version; Docker and docker-compose make sure I have the right environment. I only run one version of the codebase at a given point in time. After all, I'm mentally focused on one problem at a time, right?
Yes, running docker-compose for the first time takes a bit longer. But seeing it unfold and having all the parts present on my local system helps me understand this new system better. And really, how often does this happen?
Crossing international borders with company data and source code is also a huge risk nowadays, so much that several companies I know have banned it entirely. It's safer to keep everything on-site and have employees access whatever they need remotely.
AWS purchased Cloud9[0] a couple of years ago and is integrating it into many of their cloud offerings. It is now the default editor for Lambda functions, and while I don't use it extensively it seems to work OK.
Is the recent move to Visual Studio Code by all Facebook developers also part of this strategy?
> Also you can spin up multiple development branches of the same codebase without having to switch branches or stash changes (a big deal when you have one giant monorepo).
Could you elaborate on this? I’m not familiar with these cloud IDEs.
I've just set up remote development VM on my personal server and it's super seamless. Love this stuff, crazy fast. It's scary that I, a person who expressed my dislike with Microsoft in the past, got hooked on VSCode + Typescript.
But.
There is a possibility that in a corporate environment this tool could be more of a hurdle. Imagine that you don't have control over your VM. No root, no sudo, everything is monitored and scrutinized.
Sounds scary.
I'd rather have a bare metal machine with something like Proxmox just for my team's needs.
Why is everyone in this thread so into the concept of remote development, especially when it comes to the likes of Microsoft and Facebook leading the charge? Both companies harvest massive amounts of data about their users and have a proven track record of privacy violations.
What happens if this becomes the norm and IDEs are only in the cloud? Do these corporations decide who gets to code on their platforms and gain the ability to peep into the technical secrets of every competitor?
I will never support this.
I did remote development through Putty sessions using vim/tmux for several years. Would have been nice to have something like this.
Anything is better than "cloud desktops" and anything that involves a remote desktop connection for the UI, with bad font rendering etc. I don't use vim, I use IDEs, and I like it when they have native, fast rendering.
vscode is based on Electron, so even if you run it locally, it runs in a browser engine. So Remote Desktop is not needed.
There are also some other features for remote development. You can, for example, develop inside Docker while running the UI on your desktop [1], or you can connect via SSH [2]. It's a bit like an IDE split between client and server components: all the intelligence runs remotely, and just the UI runs locally.
[1] https://code.visualstudio.com/docs/remote/containers [2] https://code.visualstudio.com/docs/remote/ssh
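For a concrete idea of what [1] looks like in practice: the container setup is driven by a `.devcontainer/devcontainer.json` file checked into the repo. A minimal sketch (the Dockerfile and extension name here are just illustrative; check the linked docs for the current set of supported fields):

```json
{
  "name": "my-project",
  "dockerFile": "Dockerfile",
  "extensions": ["ms-python.python"]
}
```

VS Code then offers to reopen the folder inside that container, installs the listed extensions on the container side, and runs all the language tooling there.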
Electron is not the browser; it's node.js plus an HTML DOM. I get what you are trying to say, but being Electron-based doesn't really buy you anything in the scenarios you listed. Native-UI IDEs can and do use SSH filesystems and language servers the same way VS Code does (for example, both of those features exist in Visual Studio proper, and work even more reliably imo).
So there are two things about vscode which make it an interesting solution for remote development.
One is that it is based on Electron. This has enabled Microsoft to create Visual Studio Online[1], which is vscode accessed through a browser (backed by a container/VM hosted at Microsoft).
The other is the architecture, which splits the IDE into a client and a server. It's not about file system access via SSH: the core of the IDE actually runs on the remote machine, and the UI part (the client) talks to it over an ssh connection.
Would be interesting to know if this was actually the master plan for vscode right from the beginning or if they realized these opportunities on the go.
[1] https://visualstudio.microsoft.com/services/visual-studio-on...
Node.js runs on V8, so if you add the DOM and HTML... you have, essentially, a browser.
That doesn't really buy you anything for distribution over a native UI + IO framework.
You could use the DOM-based text editor (I think the VS Code one is called Monaco?) to build a cloud IDE (I've seen a few instances of this), but VS Code has nothing specifically suited for this over standard IDEs.
I hadn’t realized Nuclide was dead. It had a great experience browsing remote filesystems (albeit with quite a heavyweight setup), but the big problem with it is that it tried to do too much. It overrode almost every preference and plugin. It was crazy.
You just had to drink the kool-aid and give up any thought of using any of your own preferences, but it worked.
Guess I’m uninstalling it now.
Remote development via TRAMP is the number 1 reason I'm stuck with Emacs. More support via other editors is always appreciated.
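For anyone who hasn't used it: TRAMP hooks into the normal `find-file` machinery, so a remote file is just a specially prefixed path (the user/host names below are placeholders):

```text
C-x C-f /ssh:me@devbox:/home/me/project/main.c    ; edit a file over ssh
C-x C-f /ssh:me@devbox|sudo:devbox:/etc/hosts     ; multi-hop: ssh, then sudo on the remote host
```

Everything else (dired, saving, version control where supported) then works on the remote machine through that same path.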
The sad thing about tramp is that it requires code to be written against it. When it was added to emacs, developers had not previously had to think that some buffers might be on remote machines, so their code would not magically work against it. New code ought to be written with tramp in mind, using functions that work remotely, but this isn't always done.
With a complex development environment (say compiler, separate autocompletion/jump to definition program, version control, maybe a test runner or other external programs) made out of emacs modes, it only needs one thing to not be made with tramp in mind or with bugs for the whole thing to fall apart in practice.
What I have is ssh access to a powerful development box and a relatively non-powerful desktop (possibly using multiple of these in one day). Everything runs on the dev box and I use emacs over ssh with X forwarding. This works fine (the network latency is low and I’m not super sensitive to it anyway) with some hacks to ssh back to the desktop to play sounds or open links, and it works a lot better than tramp or sshfs. But I am a bit worried about the future: emacs is becoming more graphically complex in various ways, leading to more data needing to be transferred, and the X protocol is less suited to it; and X (and in particular X forwarding) is dying for various reasons. Screen sizes are also getting bigger. A trivial update at 1080p takes a few hundred kB, a trivial update at 4K takes close to 1 MB, and viewing an image takes a lot more.
The emacs future I’m hoping for isn’t so much a better tramp as a fatter emacsclient. That is, instead of using ssh to run emacsclient on a remote box, which causes the server to connect back through ssh to my X server to open a frame, I would run emacsclient locally, which would talk (over ssh, I guess) to the remote emacs, speak a more efficient protocol to it, and use modern APIs to push the pixels onto the screen.
Emacs works great in a terminal; all the issues you mentioned disappear ;) (you didn't mention why you really need GUI Emacs). I use emacs 95% of the time inside remote tmux sessions, and you also have true color support, if that's your thing. What issues do you have?
Also, iTerm lets you open http links.
Emacs in a terminal misses out on various text properties, on things like client frames (or just multiple frames in general), on images, and on font/size variety. I suppose you may feel like you don’t ever want to touch the mouse for some reason, but I think it’s often useful for scrolling, selecting, copying, and pasting. There are also a bunch of keys which can’t be differentiated by a terminal. Finally, the GUI wastes less space on buffer dividers and the fringe.
Try the VS Code Remote Development extension now. It already works as well as native editing, and you will not miss TRAMP anymore.
https://marketplace.visualstudio.com/items?itemName=ms-vscod...
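One nice property of the extension is that it reads your standard OpenSSH client config, so a host defined in `~/.ssh/config` shows up directly in the "Remote-SSH: Connect to Host..." picker (hostname, user, and key path below are placeholders):

```text
Host devbox
    HostName devbox.example.com
    User me
    IdentityFile ~/.ssh/id_ed25519
```

Key-based auth is worth setting up, since the extension reconnects in the background and password prompts get tedious fast.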
I can't wait for this tooling to advance. It would be amazing to have one IDE that came pre-configured to work with "every" language and work on code bases of any size. With the addition of language servers there has never been a better time to build such tooling.
Has anyone found good documentation on how to get Visual Studio Online (Self-Hosted) set up?
I used this one recently (using code-server). iPad focused but it's all the same: https://medium.com/@ow/its-finally-possible-to-code-web-apps...
BTW folks who haven't tried VS Code SSH remote yet, it's _very_ impressive. For all intents and purposes you get "local" dev experience on a remote box, connected over SSH. Python code complete and cross-referencing is much smarter than what you'd usually get in Vim, and if you want Vim keybindings, you can have that too. I still use Vim for a lot of things, but when I need to work on a large codebase and need better code navigation I find myself reaching for VS Code more and more. Kudos to the team, very impressive product, especially the remote stuff. I've never before seen anything as seamless as this.
What are the benefits of having local dev experience on a remote box vs coding locally and sending the code to the remote machine to run it there (e.g. Pycharm remote deployment)?
Some companies, e.g. Google or Facebook, do not allow company code to be stored on a laptop, even if it's a company laptop. And even if they did have code there, it would not be practical, because everything above the kernel is compiled from source, and it would take forever without a build artifact cache. That, BTW, affects you even if you work with Python, because Python has to pull in a bunch of custom C++ libs to call the various services, and guess what, those are also built from source. Add to that the fact that your laptop is usually a Mac and you're developing for Linux (and not just any Linux, but a Google/FB flavor thereof, with custom hacks, custom toolchains, etc.). The code you work with is mapped to your workstation using a custom filesystem, so you don't store it there either. It's all transparent to you, of course.
So it's not really practical to do "local" development even if that were allowed.
But even if you aren't at a FANG, your company workstation often has the "right" set up and it's much more powerful than a laptop, so if you're dealing with compiled languages it might just be easier to deal with it remotely, as long as the experience is seamless.
Ok, but what if you want to run the same code on multiple remote machines (e.g. launch several deep learning simulations)?
At Google you run it on Borg. At companies which have non-trivial on-prem compute you'll probably find k8s being used for cluster management nowadays; same idea. Without cluster management you'll have to copy it to the remote machine (and set up the deps there).
Ok, so it seems like Pycharm is still the only tool which can do that
... you're using pycharm as a cluster management tool? I'm not sure if I'm impressed or horrified :P
Not a cluster, just three independent GPU servers. Each one has a remote interpreter configured in Pycharm.
I have such high hopes for remote development, I want this in all my editors/environment.
Semi-OT but: How are spinup times for different VM/host options? Would it be feasible to have a beefy dedicated machine with something on it that spins up a VM/container when I SSH to it, with all my stuff on it?
Would it be feasible to build something like that with any of the major cloud hosting providers? One always-on server accepting the SSH connection and forwarding it to a fresh VM or something?
It seems wasteful to have resources allocated 24/7 when they will probably only be used 8/5.
Remember that the VM is running on a hypervisor that is on 24x7 regardless of whether there is a virtual machine there.
The beauty of a VM is that it can get migrated around, so those hypervisor hosts can get turned off when load is not there. This is something that is likely happening already.
There is already VSCode server:
https://github.com/cdr/code-server
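Trying it out is roughly this (the flag comes from code-server's README; check the repo for the current invocation and the install instructions for your platform):

```text
# serve VS Code in the browser, bound to localhost only
code-server --bind-addr 127.0.0.1:8080 ~/my-project
```

You then open that port in a browser, typically through an SSH tunnel rather than exposing it directly.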
What was better in Nuclide that is not in this one?
I love code server and have been using it for a while. Hope they don't get crushed by Microsoft's offer. Though I guess that was always a risk.
So after reading the article, by "partner" they just mean they replaced their internal thing with VS Code? Wow, what a monumental achievement!
> by "partner" they just mean they replaced their internal thing with VS Code? Wow
Incorrect - they built their own remote-editing solution (similar to vscode's built-in edit-over-ssh, but with a load more features), and have been working with Microsoft to get those features upstream.
VSCode is the only Electron based app that I freely install.
I just hope that, with the jabs the React Native for Windows team takes at Electron-based apps, VSCode eventually gets rewritten in React Native instead.
The article mentions that "remote development" is a trend nowadays. Is it really so? ABAP has been remote-development-based from day one, and it's quite an old language.
This is good, I guess, but it's interesting to realize that news about vscode scares me.
It's in such a good place right now. With the exception of multi-monitor support (the only real issue with Electron, in my opinion, but a big one), everything about it is wonderful to the point where I feel that it can almost only get worse from here.
So far the team has been incredible, continuously improving without making it feel bloated and slow so hopefully my fears will continue to be put to shame for years. But yeah.
My ML team is currently developing remote environments along these lines. I think I'll cry the day we have build pipelines that can auto-upgrade our conda environments overnight.
ok, so Atom is dead
I've read the whole thread so far and haven't seen anyone mention their network conditions. Can people tell me whether this is just as nice over Pacific-ocean-sized hops, or does it only work nicely when you're talking down the road / in the same building?
> We’re making Visual Studio Code the default development environment at Facebook
IDE choice is a highly personal thing. This sounds awful.
It's the _default_, as they say in the article:
> There is no mandated development environment. Some developers use vim. Some use Emacs. And even more engineers use our internal, unified development environment called Nuclide.
"Default" readily morphs into "only supported". Into "if you use something else and complain to IT when it breaks, IT addresses the issue by replying 'try using an editor from this century'."
To my knowledge the "default" editor has changed ~3 times in the past 5 years (fbide, atom, vscode) - and yet I've been using vim this whole time with no problems, and never had anybody suggest that I stop ¯\_(ツ)_/¯
When everyone uses a different editor then no one gets support.
Default is definitely different than mandated. Facebook provides engineers a ton of freedom for their development environment.
There are many at FB from whom we would have to pry vim or emacs out of their hands, and even then we would not be successful. :-)
Having a recommended/default IDE is not the same thing as forcing everyone to use it.
In every place I've been, the IDE choice has been more of a team decision; configuration was personal (some shortcuts, colors, etc.). Too much time went into the build environment and other development tools for us; plus, if you wanted someone to come in and help (short-term pair programming), you needed to be on the same page.
> In every place I've been, the IDE choice has been more of a team decision
What OS(es) were you using at the time?
In my experience, IDE choice was a team decision in Windows (and perhaps Mac?) shops, where the build system was tightly coupled to the IDE.
But in shops that develop Linux code, the IDE and build system tend to be loosely coupled, and the IDE was a matter of personal preference.
Java is also like this. In Java shops it's pretty much Eclipse or GTFO.