1. Check whether your CPU supports virtualization technology. Download CPU-Z and look in the Instructions list for Intel VT-x or AMD-V.
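If a recent Windows is already installed, you can also check this without extra tools. A minimal PowerShell sketch (an assumption on my part: it relies on the Win32_Processor property VMMonitorModeExtensions, which is only populated on Windows 8 / Server 2012 and later):

# True if the CPU exposes Intel VT-x / AMD-V
(Get-CimInstance Win32_Processor).VMMonitorModeExtensions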


2. Check whether virtualization is enabled in the BIOS (CPU Configuration). For example, on an Intel SR1670HV both Intel Virtualization Tech and Execute-Disable Bit Capability must be enabled. Usually you just have to find the settings (in CPU Configuration) that contain “Virtualization” or “Execution” and enable them.
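Before rebooting into the BIOS you can check the current firmware state from inside Windows. Again a hedged sketch, using the Win32_Processor property VirtualizationFirmwareEnabled (present on Windows 8 / Server 2012 and later, empty on older versions):

# True if virtualization is switched on in the BIOS/UEFI
(Get-CimInstance Win32_Processor).VirtualizationFirmwareEnabled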
3. I had a problem enabling the hypervisor on my server, and steps 1 and 2 did not help me. Before that I had had an issue with the OS loading, so I had repaired the server with the bootrec.exe utility (the /rebuildbcd and /fixboot parameters). Following http://msdn.microsoft.com/en-us/library/windows/hardware/ff542202(v=vs.85).aspx, I checked my boot configuration with bcdedit (just type it into cmd) and noticed that the following parameter was missing:
hypervisorlaunchtype [ Off | Auto ]
Controls the hypervisor launch options. If you are setting up a debugger to debug Hyper-V on a target computer, set this option to Auto on the target computer.
I solved it by running bcdedit /set hypervisorlaunchtype auto, restarted the server, and successfully started my VMs.
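Put together, the check and the fix look roughly like this (run from an elevated prompt; the commands work the same from cmd or PowerShell):

# Show the boot configuration; if hypervisorlaunchtype is missing or Off, the hypervisor will not be started
bcdedit
# Tell the boot loader to launch the hypervisor automatically from now on
bcdedit /set hypervisorlaunchtype auto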
How do I restart the server?
Hmm... from the Start menu, or from PowerShell: Restart-Computer
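For reference, the PowerShell form (-Force proceeds even if other users are logged on; from plain cmd, shutdown /r does the same job):

# Restart the local machine immediately
Restart-Computer -Force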