Drewsky

VMWare Workstation PRO vs MS Hyper-V


31 minutes ago, Attila Kovacs said:

how can you work on a slow and laggy VM?

Given proper hardware, working in a VM is not noticeably slower. (Admittedly, I can only speak for VMware Workstation here.)


The only thing that is slow in my VMs is 3D graphics, but since I don't rely on it, that is not a problem.

Posted (edited)

I do JS development in the same VM. I have noticed that animations (Bootstrap-style CSS effects) are turned off in the VM (in both Chrome and Edge) but work on the host.

The browser is just a test area for me, so I do not care. Had there been a similar discrepancy in my desktop project(s), I would be concerned.

I'm running Hyper-V on the same host.

Edited by Dany Marmur
Added the current choice of VMmgm.


I have been using VMware Workstation Pro for many years and totally love it.

 

I have two external screens attached to my notebook (plus the internal one, which makes three), and I can flexibly set VMware to use any combination of screens and resolutions. This makes it very easy to test DPI-aware applications. Another big advantage for my specific line of work is that VMware lets me configure virtual COM ports (which communicate with each other through named pipes), so I can test serial communication protocols even though my notebook has no "real" RS232 ports. VMware also supports virtual networks, so I can define "private" networks between virtual machines that are isolated from the others. There are tons of ready-made VMs to download from the internet, with appliances like routers etc.
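A sketch of that virtual COM port setup as it might look in a VM's .vmx configuration file; the pipe name is illustrative, and the keys are, to the best of my knowledge, the ones VMware Workstation uses for named-pipe serial ports:

```
serial0.present = "TRUE"
serial0.fileType = "pipe"
serial0.fileName = "\\.\pipe\com_test"
serial0.pipe.endPoint = "server"
serial0.tryNoRxLoss = "TRUE"
```

A second VM pointing at the same pipe name with serial0.pipe.endPoint = "client" gives two guests whose COM1 ports talk to each other.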

 

When configured right, a VM in VMware is not noticeably slower than the host machine.

On 5/19/2021 at 8:11 AM, Lars Fosdal said:

MS submitted changes to the Linux kernel in June 2020 that would allow running Hyper-V hosting on Linux.

Not sure if this is present in any current Linux release yet.

I don't believe it was about running Hyper-V on desktop Linux so much as being able to replace all but Hyper-V's micro-kernel with Linux (aka "Linux as the root partition"). It doesn't look like the user space patches have landed yet to allow actually creating virtual machines with it.

 

https://www.phoronix.com/scan.php?page=news_item&px=Microsoft-Linux-Root-Partition

 

 

On 5/12/2021 at 9:16 AM, Lars Fosdal said:

3D GPU performance is a mixed bag, imo - but why would you test that in a VM? 

You'd remote debug that from the VM on a physical machine.

 

On Linux with KVM, one can reserve hardware for the VM to have dedicated access to. On my home desktop I finally got this set up correctly so that a Windows VM has exclusive access to an old second video card I stuck in the machine. I was able to run some 3D games in the VM essentially as well as the old card could before under native Windows. I hooked the card's output to the second input on my monitor and can toggle back and forth between fullscreen Linux and fullscreen Windows with a hardware GPU. 🙂 The keyboard and mouse can be toggled between systems, and the sound is virtualized, although one could dedicate these to the VM as well, given the extra hardware.

 

I *think* it's possible to configure things under some circumstances to automatically switch graphics cards such that, say, Linux boots with the newer graphic card but when the VM runs Linux gets the old card and the VM gets the better one, but I haven't tried this yet. If possible, it's not exactly easy.

On 5/19/2021 at 7:49 PM, Attila Kovacs said:

I've already asked before, how can you work on a slow and laggy VM?

It sounds like washing your feet with socks on.

I've never encountered anyone except Delphi users who develop inside VMs. Seriously, no one else. I'm not sure why you'd want to do it either. You're going to end up with less RAM, slower CPU, slower disk access, and the need to maintain two OSes now. I can understand testing in a VM where isolation and reproducibility are important, but I don't understand the advantages of using a VM for development. Honestly, it might have something to do with how hard it's been historically to get a Delphi development environment set up (or upgraded).

Posted (edited)
17 hours ago, Joseph MItzen said:

I'm not sure why you'd want to do it either.

 

EDIT: Man, I did not read your post through; yes, exactly that! Sorry. I need my coffee.

 

Because of the IDE ecosystem being what it is (and it might not be as bad as one thinks when running into some problems).

 

First off, just installing two Delphi versions on the same machine is scary enough.

 

More importantly, if a client refuses to keep up when I, as the vendor, move up through the versions, I need to retain the maintainability of that specific system. Having everything (IDE, third-party components, plugins, and all the surrounding tools) installed in a VM (I still needed to fire up Delphi 2009 on an XP VM four years ago) is a good way of not having to upgrade every little thing you have ever deployed; for example, a TDataSet-based library from Delphi 3 to XE2. Sometimes (especially one or two decades ago) third-party components would show breaking changes between Delphi versions. Oh, and older licensing schemes were based on which IDE version you were using. Imagine having to pay for upgrades all the way through 10 just because of one client who once paid for a specific project.

 

 

Edited by Dany Marmur
Ignorance on my part, typos

16 hours ago, Joseph MItzen said:

I've never encountered anyone except Delphi users who develop inside VMs. Seriously, no one else. I'm not sure why you'd want to do it either. You're going to end up with less RAM, slower CPU, slower disk access, and the need to maintain two OSes now. I can understand testing in a VM where isolation and reproducibility are important, but I don't understand the advantages of using a VM for development. Honestly, it might have something to do with how hard it's been historically to get a Delphi development environment set up (or upgraded).

Some advantages of using virtualization:

 

Backing up a VM is as easy as copying a file. If you break something in a VM, it doesn't affect the host machine. If you decide to replace your host machine with a new one, you're up and running again within an hour.

 

I've configured my VM to use separate virtual disk files for C: and D: and I keep my projects on D:.

 

Whenever I update Delphi, I archive the virtual C: drive first. This makes it dead easy to fire up an older Delphi version whenever I need it.
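The "a VM is just files" point above can be sketched in a few lines of shell; the paths and names are examples, and the VM should be powered off first so its files are consistent:

```shell
# Archive a powered-off VM's directory (.vmx, .vmdk, ...) into a dated tarball.
# Paths and names below are illustrative, not anyone's actual setup.
backup_vm() {
  vm_dir=$1
  backup_dir=$2
  mkdir -p "$backup_dir"
  # The whole VM is just this directory; tar it with a date stamp.
  tar -czf "$backup_dir/$(basename "$vm_dir")-$(date +%Y%m%d).tar.gz" \
      -C "$(dirname "$vm_dir")" "$(basename "$vm_dir")"
}
```

Restoring on a new host is the reverse: extract the tarball and open the .vmx in VMware.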

 

 

 

 

 

16 hours ago, Dany Marmur said:

 

First off, just installing two Delphi versions on the same machine gives a scare.

I have 12 versions of Delphi on my dev machine (XE2 to 10.4.2), all working fine. The trick is managing your system PATH environment variable, which the Delphi installer still rudely prepends its entries to; the result can push the PATH length over 2048 characters, which on some older systems will cause issues with Windows.

The trick is not to install in the default location and to keep the paths as short as possible. This blog post from 2014 always gets lots of views shortly after a new Delphi release - I wonder why?

4 hours ago, Vincent Parrett said:

I have 12 versions of Delphi on my dev machine (XE2 to 10.4.2), all working fine. The trick is managing your system PATH environment variable, which the Delphi installer still rudely prepends its entries to; the result can push the PATH length over 2048 characters, which on some older systems will cause issues with Windows.

The trick is to not install in the default location and keep the paths as short as possible.

I can top that: 6 to 10.4 (I even considered adding Delphi 5)

And yes, the path variable becomes a pain in the lower back if you install many Delphi versions to the default directories. Unfortunately updates tend to revert any custom installation directories.

Using additional environment variables and adding only these variables to the path also helps.
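That environment-variable trick might look roughly like this on Windows (hypothetical install roots; the idea is that PATH references short variables instead of long literal paths; shown with session-local "set" for illustration, whereas the real setup would use persistent system environment variables):

```
:: Hypothetical short install roots, one variable per Delphi version
set BDSXE2=C:\D\XE2
set BDS104=C:\D\10.4
:: PATH then only needs the variables:
set PATH=%BDSXE2%\bin;%BDS104%\bin;%PATH%
```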

On 5/22/2021 at 11:53 PM, Joseph MItzen said:

I've never encountered anyone except Delphi users who develop inside VMs. Seriously, no one else. I'm not sure why you'd want to do it either. You're going to end up with less RAM, slower CPU, slower disk access, and the need to maintain two OSes now. I can understand testing in a VM where isolation and reproducibility are important, but I don't understand the advantages of using a VM for development. Honestly, it might have something to do with how hard it's been historically to get a Delphi development environment set up (or upgraded).

The same, or even a much worse, situation exists with MS VS. I use it from time to time to build some open-source projects, so I've installed Express 2010 and, for some old stuff, 2008. I assure you it's always torture. VS destroys paths, messes up frameworks, and gives mysterious errors that you have to search for. Moreover, even if you set things up correctly, it can just break some day because of some other system change. Dedicated VMs are the real salvation here.


As others said, it is easy to make backups.

It is also easy to experiment and share resources.

Would you like to run something that might potentially damage your setup? Not something I would do without extra precautions on a physical host. But in a VM? It is easy to experiment: take a snapshot, then do it anyway. If it turns out as bad as you guessed, roll back the snapshot; otherwise, commit it.
With backups that would take time to restore; with a snapshot it can all be done within a minute, including the rollback and/or commit.
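The snapshot workflow above can be sketched with VMware's vmrun command-line tool. This is a dry run: the "run" helper only echoes the commands, and the .vmx path and snapshot name are made up:

```shell
# Dry-run sketch of the try/rollback/commit snapshot workflow via vmrun.
# The .vmx path and snapshot name are illustrative examples.
VMX="$HOME/vms/DelphiDev/DelphiDev.vmx"
SNAP="before-experiment"
run() { echo "would run: $*"; }   # replace the echo with "$@" to execute for real

run vmrun -T ws snapshot "$VMX" "$SNAP"          # take the snapshot
# ...do the risky experiment inside the VM...
run vmrun -T ws revertToSnapshot "$VMX" "$SNAP"  # bad outcome: roll back
run vmrun -T ws deleteSnapshot "$VMX" "$SNAP"    # good outcome: commit (drop it)
```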

 

Another scenario I used in the past was where I had to support a multitude of database back ends for a client (Oracle/MSSQL/MySQL/PostgreSQL). Each SQL server was installed in its own VM. Need to support multiple versions of MySQL? No problem, add a VM. On a physical host this can get messy, and all the SQL servers together can burn through resources even when not in use. Yes, you can mess with starting/stopping services as an alternative; it really all depends on your needs which is the better choice.

 

Also, moving any of those VMs to a bigger host at any time (scale up/scale out) is easy.

 

Re. slow/fast: I've run tests with development VMs on my MBP against physical host installs (server class, similar CPU type but Xeon, SAS RAID) and the difference was negligible. On two-minute compile times, testing the exact same project, the difference was seconds.

Once the VM had 2 vCPUs or more assigned, it was about as fast as the Windows setup on a physical host. The kicker? I had 4 more VMs actively running on that MBP while running those tests.

 

It is also easy to do things like kernel debugging: you can connect two VMs via their virtual serial ports, all on the same host.

 

Does this mean VMs are always better?
No, there are workloads that will always benefit from running on a physical host.
Low-latency audio, video editing, games... they all do better on physical hardware.

Posted (edited)

Thank you all for the help and comments.

 

I went with VMware Workstation Pro and I am quite satisfied. The system is very responsive, and all the custom hardware that we are developing works through USB and the network. I am using VeraCrypt for encrypted containers where I keep the source code (in case my PC gets stolen), and it works flawlessly.

 

My reasons for virtualization:

  • Privacy. I plan to switch to Linux by the end of the year; I don't want Windows to be my main OS any longer. I plan to use hardware where the Intel ME, webcam, audio, and WLAN/Bluetooth can be switched off. There are a ton of reasons, but that is another discussion.
  • Tedious setup. I have hundreds of my own libraries and packages, plus over 20 commercial third-party packages (tools), and they are all interconnected. This means it takes a lot of time to set up a system that truly works. I also need to keep older versions of RAD Studio Enterprise for debugging/compiling older products where customers did not upgrade to newer software. With VMware I will be able to switch between versions quickly.
  • Portability/backups. I want to be able to move my development environment to another machine quickly in case my main PC gets broken or stolen. I do backups almost daily.
  • Testing. I want to test the software on a clean Windows install with our tools and databases preinstalled, but nothing else. Windows Sandbox is not useful in my case. Also low-spec machine simulation testing, etc.

 

I still have test computers running Windows natively for some development and testing. Of course the performance and speed of native development are important; that is why I will not switch my Unreal Engine development to virtualization, which would be disastrous. But I can set up my full Unreal Engine environment plus Visual Studio in 5 hours, and it also works natively on Linux.

 

Kind regards

Edited by Drewsky

There is one unmentioned drawback to VMs. The more you have, the more you need to keep updated and patched. That is the price you pay for flexibility.

1 hour ago, Lars Fosdal said:

The more you have, the more you need to keep updated and patched

The importance of updates is greatly overestimated.


Another advantage of VMs is that you can do remote debugging and test your software on a different version of Windows than the one you are developing under.

 

I can easily compile a Delphi application in my Windows 10 development VM, deploy it to a network share on my Windows 7 VM, and debug it on that operating system. Piece of cake. The same goes for Linux. It is even possible to run macOS under VMware, though Apple doesn't allow it unless the underlying host is an Apple machine.

 

 

The latest VMWare version can co-exist with Hyper-V on the same host machine by the way.

 

2 hours ago, A.M. Hoornweg said:

The latest VMWare version can co-exist with Hyper-V on the same host machine by the way.

 

Supported since VMware Workstation 15.5

 

Yet another advantage: I develop for macOS. On my Mac I run Delphi in a VM; I can deploy to the Mac host, and when I debug, the screen switches to the host automatically. Very smooth.


Yesterday I got a bug report for the latest Big Sur update. I fired up a VM that is slightly older and upgraded it to the latest Big Sur version to verify that it is indeed an issue in this version only.

 

You can also take that Windows 10 VM that runs under Fusion on macOS and run it under VMware Workstation on Windows... or Linux. This is exactly what I do when travelling. It is also easy to keep an encrypted copy of your VM on a microSD card, so you have something to fall back on in case the laptop goes missing. There are too many advantages not to use it.

9 minutes ago, Lars Fosdal said:

It is good to alert users, but just to be clear: this is for vCenter Server, not for VMware Workstation/Fusion/Player.

Normally, VMware vCenter Server should not be directly accessible from the internet (yes, some people/companies do that anyway).

However, even on a local network it is still good to run the vCenter Server appliance update, if you run it.

Guest

Lars, you are bringing this up 3 days after that article, while digging just a little deeper suggests the vulnerability might have been known as far back as January, and is only now being patched.

 

Anyway, to keep updated I suggest following the CVE register itself instead of depending on articles here and there.

https://cve.mitre.org/index.html

Use the search there, and for easier and faster access bookmark your own links, like this one:

https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=vmware

 

And just as Wil mentioned, keep in mind that the article refers to CVE-2021-21985, which belongs to vSphere: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-21985

Also, the disclaimer makes clear that the number and date are not what matters; the issue was recorded publicly by VMware in the last week of May, so going by the dates, they are always late.

https://www.vmware.com/security/advisories/VMSA-2021-0010.html

 

There are hundreds of vulnerabilities used by malicious parties where the authors know about them but have not disclosed that knowledge, simply because they do not yet understand how to reproduce them; only once they do understand can they prepare a fix/patch, after which they will either sit on it or announce it.

 

 

PS: Lars, PS Remoting is dangerous too; I cannot trust it the way you do. The minimum you can do is limit access to it with a firewall whitelisting specific IPs for specific ports; never ever leave it open to everyone. The same goes for any/all OS components.


The VMware thing popped up in my feed. My gut reaction was to bring it forward.

 

I agree, PS Remoting has risks. But if an intruder is already inside the domain, you are already in trouble.

We only allow it for specific domain users, and it is explicitly enabled only for specific machines.

The initiating host must be inside the domain.

 

 

 


I'm running multiple VMs on VMware Player with no issues on Windows: a Threadripper with 12 cores/24 threads, 3 VMs running all the time plus the main machine; no issues, all smooth.

 

I'm also running another box on XCP-ng (the free XenServer) with Windows/Linux servers. VMware is a good choice in general; virtualization is their only reason to exist, so you can't go wrong with them.

 

