
Cannot transfer files anymore from my Galaxy Nexus through USB

Last Friday, I tried to hook up my Galaxy Nexus phone through USB to transfer some files to my computer. After a few seconds, a completely empty Nautilus window appeared. Once again, Ubuntu was incapable of detecting my device. This happened a few versions ago, and back then I had to use obscure, impossible-to-remember MTP commands to perform transfers. I didn't want to search for these commands again, tired of losing more and more time on artificial problems. If Ubuntu degrades this much from one version to the next, I would be better off switching to Windows and installing Linux only in virtual machines. This is not the only issue I have, and most bugs (mouse pointer, Emacs, M-Audio sound stability) persist through version upgrades. Canonical now seems focused on Mir and newer Unity versions, which I really dislike, because Mir will break everything for five or six versions. Either that, or Canonical will cut corners on keyboard accessibility, resulting in a UI that will be almost unusable for me. I expect this will be my hardest Linux time ever when that beast comes out, until it stabilizes.

However, my actual bug was worse than this: the phone didn't connect through Windows either! When I plugged my phone into a Windows machine, an empty Explorer window similar to the one below came up, and nothing else. Does that mean my phone is dying, progressively losing functionality? Probably.

Screenshot 2014-11-30 09.04.15

I didn't attempt any Google search about this. It is worthless: I would just find forum posts about people replacing the cable, doing factory resets, sending their phone for repair or replacement, etc. The phone hooks up and something is detected by the computer, so why would a new USB cable help? Yes, I can do factory reset after factory reset, and that would probably fix it, but what's the point if I know I will have to redo it a few months later, for no reason, unless I install NOTHING on the phone? And all I would get through technical support are no-go solutions, or offers for a new phone that would force me to switch to a more expensive plan with my provider or to buy something from nowhere with an old version of Android.

Before accepting this conclusion and starting to look at whether I could get a Google Nexus phone from Google rather than going through Fido and affecting my phone/data plan, or getting something somewhat correct elsewhere, I tried with a different cable: same effect. Then I saw the home screen on the device and remembered I had recently set up a PIN to protect my Google account from tampering by somebody who might get hold of my phone if I lose it, or who steals it from me.


I entered my PIN and saw with surprise and relief the following window on my computer:

Screenshot 2014-11-30 09.05.34

Tada! This time, the solution was simple! This was just normal data protection, and my USB connection was still working!

Note that locking the phone doesn't shut off USB access until the cable is disconnected. The PIN also doesn't prevent me from answering a call, so this is not as problematic as I feared it would be.


One SSD for my HTPC

A bit more than a month ago, I successfully transferred my dual-boot Windows 8.1 and Ubuntu 14.04 setup from two 120Gb Solid State Drives (SSD) to a single 240Gb drive. I ran into several problems restoring the bootloaders of the two operating systems, thought many times I would have to reinstall, then figured out a way to make them boot.

But what happened to the two drives I removed from my main computer? Well, they sat on top of a shelf. At least one drive would be repurposed, though: it would become part of A.R.D.-NAS, my HTPC. On Sunday, October 26 2014, I finally found the time and courage to undertake the transfer operation. This time, the software part was pretty smooth, but the hardware part was a needlessly intricate puzzle. During the process, I wondered many times about the purpose of generic hardware if it doesn't fit well together, and griped about the lack of any viable alternatives.

The sacrifice

Well, my NMedia HTPC case has six 3.5″ drive bays, which is quite nice for an HTPC case. This is possible because I chose an ATX case, in order to get a motherboard with rear S/PDIF audio connectors rather than just headers accepting brackets I could find nowhere. This case is a bit bulky; I would build from a MicroATX case if I had to start from scratch.

So installing this SSD seemed obvious at first: just add the drive, transfer the Linux boot partition from the hard drive to the SSD, remove the original boot partition, set up GRUB on the SSD, and tada. No, things are rarely that simple. I thought my motherboard had only four SATA ports, and they were all used: a 1Tb hard drive, a second 1.5Tb hard drive, a third 3Tb hard drive, and a blu-ray reader/writer. Why so many hard drives? Well, I am ripping and storing all my video discs, even the huge blu-rays, to avoid having to search for them on shelves.

Even if I had correctly remembered the six ports on the motherboard (two are free!), my PSU only had four SATA power connectors, so I would not be able to easily and reliably connect all my drives. I could try to find splitter cables or molex-to-SATA adapters, but that would add a point of failure. I could replace my PSU with one offering more SATA power cables, but it would also have more molex cables, PCI Express connectors, etc. Unless I went with a more expensive modular PSU, all these cables would have cluttered my case.

The safest and cheapest solution was to sacrifice one of the hard drives: the 1Tb one of course, the smallest. I thus had to move files around so that less than 120Gb of data remained on the hard drive that would be moved to the SSD. That process took a lot longer than I expected; my poor HTPC spent the whole Saturday afternoon copying files around! Fortunately, this is machine time, so I had plenty of time to experiment with music creation on my still-new UltraNova synthesizer combined with Ableton Live's multi-track abilities.


On Sunday, I first burned the Ubuntu 14.04 ISO to a DVD. Yes, Ubuntu is now so large it only fits on a DVD. After that, I shut down the Minecraft server running on my HTPC and moved its files to another, older PC. I started the server on the old PC and reconfigured the port mapping on my router. This way, if my friend wanted to kill a few zombies and creepers while I was installing my SSD, he would be able to do so, and I would not be stressed if something bad put my HTPC out of service (like something getting stuck in the CPU fan and breaking it).

I then removed the cover of my HTPC and spent quite a bit of time trying to figure out which drive was the 1Tb one. Based on the position of the SATA connector on the motherboard, I presumed it was the left-most drive. I thus had to disconnect the drive in the middle bay and use the freed-up power and data connectors to hook up my SSD. I then booted up the machine.

The following picture shows the drive temporarily hooked up.


Then I remembered my old 22″ LCD that I stopped using after purchasing my Dell touch screen. I went to pick it up in my computer room, put it on my kitchen table and plugged it in. This way, I would have the screen right in front of me with the keyboard on the table, rather than sitting in front of my 46″ LCD HDTV with the keyboard on my knees.

With the SSD and the LCD hooked up, I booted up my HTPC and quickly stuck the Ubuntu DVD in the blu-ray drive. After an incredible amount of time, the machine finally booted into the live Ubuntu DVD!

Data transfer

After Ubuntu started, I launched GParted and realized that I had chosen the wrong hard drive: the 1Tb drive containing my Ubuntu setup was the disconnected one. Oh no! Did that mean I would have to turn off the machine, connect the right drive, and wait once again through this stupidly long, almost five-minute, DVD-based boot? No, not this time! Feeling like a cowboy, I decided to try something: drive hot swapping. This is possible with SATA, so let's see! I disconnected the 1.5Tb hard drive, starting with the SATA data cable, then the power cord, then hooked up the 1Tb drive. Hearing the old hard drive coming back to life was kind of thrilling. Everything went well: no cable stuck in my CPU or rear fans, and the PC didn't freeze like it would have with IDE. The hot swap worked.

After that, things were relatively straightforward. As with my main PC, I used GParted to transfer my Linux boot partition and reconstruct the layout. I fortunately remembered, beforehand, to reset the partition table. Had I not done that, the GPT that was on my SSD would have caused booting issues that would have driven me mad! I would probably have ended up reinstalling everything, angry at Ubuntu, at technology and probably at all of humankind. A single step, recreating the msdos partition table from GParted before the transfer, saved me from all that!

The following picture shows my LCD, on which we can see the progress of the transfer.


See how bulky this setup was: HTPC on the floor, case opened, SSD hanging on top. Fortunately, it was possible to make this setup clean again after all this.


The home partition: too big to fit on the SSD

Unfortunately, GParted refused to transfer my home partition to the SSD, because it was obviously too large. I could have shrunk it in order to copy it, but I wanted to avoid altering the hard drive in case something bad happened. I thus instructed GParted to simply create a blank Ext4 partition and used cp to perform the copy. The following terminal session shows how I managed to do it in such a way that all file metadata (timestamps, ownership, permissions) was preserved.

ubuntu@ubuntu:~$ mkdir /media/old-home
mkdir: cannot create directory ‘/media/old-home’: Permission denied
ubuntu@ubuntu:~$ sudo mkdir /media/old-home
ubuntu@ubuntu:~$ sudo fdisk -l /dev/sda

Disk /dev/sda: 1000.2 GB, 1000204886016 bytes
255 heads, 63 sectors/track, 121601 cylinders, total 1953525168 sectors
Units = sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x000e2c4d

   Device Boot      Start         End      Blocks   Id  System
/dev/sda1   *          63    40965749    20482843+  83  Linux
/dev/sda2        40965750    81931499    20482875   83  Linux
/dev/sda3        81931500  1953520064   935794282+   5  Extended
/dev/sda5        81931563  1943286659   930677548+  83  Linux
/dev/sda6      1943286723  1953520064     5116671   82  Linux swap / Solaris
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sda5 /media/old-home/
ubuntu@ubuntu:~$ ls /media/old-home/
eric  lost+found  mythtv
ubuntu@ubuntu:~$ sudo fdisk -l /dev/sdg

Disk /dev/sdg: 120.0 GB, 120034123776 bytes
255 heads, 63 sectors/track, 14593 cylinders, total 234441648 sectors
Units = sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x000d9a0c

   Device Boot      Start         End      Blocks   Id  System
/dev/sdg1            2048    40968191    20483072   83  Linux
/dev/sdg2        40968192    81934335    20483072   83  Linux
/dev/sdg3        81934336   234440703    76253184    5  Extended
/dev/sdg5        81936384    92170239     5116928   82  Linux swap / Solaris
/dev/sdg6        92172288   234440703    71134208   83  Linux
ubuntu@ubuntu:~$ sudo mkdir /media/new-home
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sdg6 /media/new-home/
ubuntu@ubuntu:~$ sudo cp -a /media/old-home/* /media/new-home
ubuntu@ubuntu:~$ ls -a /media/new-home/ -l
total 36
drwxr-xr-x  5 root root  4096 Oct 26 19:49 .
drwxr-xr-x  1 root root   100 Oct 26 19:44 ..
drwxr-xr-x 69 1000 1000 12288 Oct 25 22:52 eric
drwx------  2 root root 16384 Sep 26  2009 lost+found
drwxr-xr-x  3  122  130  4096 Jan 24  2011 mythtv

The main idea is to mount both the old and new partitions, then use cp with the -a option and root access (sudo) in order to preserve everything. The operation went smoothly.
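The effect of -a is easy to verify on a throwaway file: it implies recursion plus preservation of mode, ownership and timestamps. A small demonstration (paths are arbitrary examples):

```shell
# Create a source file with a deliberately old timestamp and specific mode
mkdir -p /tmp/cp-demo/src /tmp/cp-demo/dst
echo data > /tmp/cp-demo/src/file
touch -d '2011-01-24 00:00' /tmp/cp-demo/src/file   # back-date it
chmod 640 /tmp/cp-demo/src/file
# Archive-mode copy
cp -a /tmp/cp-demo/src/file /tmp/cp-demo/dst/file
# Both stat lines print the same mtime and mode: metadata was preserved
stat -c '%y %a' /tmp/cp-demo/src/file
stat -c '%y %a' /tmp/cp-demo/dst/file
```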

The boot loader

Even after copying all Ubuntu-related data from my old hard drive, my SSD was still not bootable. To make booting off the SSD possible, I had to install GRUB. Unfortunately, reinstalling GRUB on Ubuntu is not as simple as it should be. If there is a package for it, why isn't it built into Ubuntu's image? Maybe because, for most setups, reinstalling from scratch takes 15 minutes. That's true, but then what about the tweaks to fix the too-small mouse pointer, make XBMC work with S/PDIF sound, reinstall MakeMKV, etc.? Each step is simple, at least when no unexpected difficulty creeps in, but the sum of small things to tweak makes it long.

So let’s avoid this by running the following!

ubuntu@ubuntu:~$ sudo mkdir /media/ubuntu
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sdg1 /media/ubuntu/
ubuntu@ubuntu:~$ sudo mount --rbind /dev /media/ubuntu/dev
ubuntu@ubuntu:~$ sudo mount --rbind /sys /media/ubuntu/sys
ubuntu@ubuntu:~$ sudo mount --rbind /proc /media/ubuntu/proc
ubuntu@ubuntu:~$ sudo chroot /media/ubuntu
root@ubuntu:/# grub-install /dev/sdg
Installing for i386-pc platform.
Installation finished. No error reported.
root@ubuntu:/# update-grub
Generating grub configuration file ...
Warning: Setting GRUB_TIMEOUT to a non-zero value when GRUB_HIDDEN_TIMEOUT is set is no longer supported.
Found linux image: /boot/vmlinuz-3.13.0-37-generic
Found initrd image: /boot/initrd.img-3.13.0-37-generic
Found linux image: /boot/vmlinuz-3.13.0-36-generic
Found initrd image: /boot/initrd.img-3.13.0-36-generic
Found linux image: /boot/vmlinuz-3.13.0-35-generic
Found initrd image: /boot/initrd.img-3.13.0-35-generic
Found linux image: /boot/vmlinuz-3.2.0-61-generic
Found initrd image: /boot/initrd.img-3.2.0-61-generic
Found linux image: /boot/vmlinuz-3.0.0-17-generic
Found initrd image: /boot/initrd.img-3.0.0-17-generic
Found linux image: /boot/vmlinuz-2.6.38-12-generic
Found initrd image: /boot/initrd.img-2.6.38-12-generic
Found linux image: /boot/vmlinuz-2.6.32-25-generic
Found initrd image: /boot/initrd.img-2.6.32-25-generic
Found linux image: /boot/vmlinuz-2.6.31-21-generic
Found initrd image: /boot/initrd.img-2.6.31-21-generic
Found linux image: /boot/vmlinuz-2.6.28-16-generic
Found initrd image: /boot/initrd.img-2.6.28-16-generic
Found memtest86+ image: /boot/memtest86+.elf
Found memtest86+ image: /boot/memtest86+.bin
Found Ubuntu 14.04.1 LTS (14.04) on /dev/sda1

The main idea here is to create a chroot environment similar to my regular Ubuntu setup, then install GRUB from there. I ran update-grub to make sure any disk identifiers would be updated to point to the SSD rather than the old hard drive. Unfortunately, a small glitch happened: update-grub detected the Ubuntu setup on my old hard drive. To get rid of this, I had to unmount the old hard drive and unplug it! After rerunning update-grub, I got the correct configuration.

Updating mount points

Since I rebuilt the home partition rather than copying it, its UUID changed, so I had to update the mount point in /etc/fstab. I thus ran the following:

root@ubuntu:/# cd /dev/disk/by-uuid/
root@ubuntu:/dev/disk/by-uuid# ls /dev/disk/by-uuid/ -l | grep sdg6
lrwxrwxrwx 1 root root 10 Oct 26 15:37 fb543fcb-908a-463d-bc1f-896f1892e3ad -> ../../sdg6
root@ubuntu:/dev/disk/by-uuid# ls /dev/disk/by-uuid/ -l | grep sdg1
lrwxrwxrwx 1 root root 10 Oct 26 16:11 54f4cbd6-aed0-4b43-91c0-2f8d866f3ee3 -> ../../sdg1

Having figured out the UUIDs, I opened /media/ubuntu/etc/fstab with gedit and made sure the mount points were correct. I only had to update the UUID of the /home partition.
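For reference, after the edit, the /home line in fstab carried the UUID reported above for /dev/sdg6. The mount options below are the stock Ubuntu defaults, so take the exact line as a sketch rather than a verbatim copy of my file:

```
# /home was moved to /dev/sdg6 on the SSD
UUID=fb543fcb-908a-463d-bc1f-896f1892e3ad  /home  ext4  defaults  0  2
```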

Test boot

After all these preparatory steps, it was time for a test! I powered off the computer and made sure my old 1Tb hard drive was unplugged and my new SSD was hooked up. I turned on the PC and waited for the forever-lasting BIOS POST. Why does a desktop take so long to boot while a laptop almost instantly hands off control to the OS? After the BIOS handed off control, I got a blank screen with a blinking cursor, nothing else. I tried a second time: same result.

So after all these efforts, did I really have to format and reinstall from scratch? It seemed so. Before doing that, I rebooted my machine once again and entered the BIOS setup by hitting the DEL key. Once there, I looked at the hard drives and found that the SSD was detected, but not on SATA port 0.

I turned off the machine and connected the drive to a different port, the one that seemed to be the first. Looking into the BIOS setup again, my SSD was now on port 0. OK, let's try that one last time!

After a blank screen lasting too many seconds for an SSD boot, making me fear a frustrating reinstall, the Ubuntu logo appeared, and my desktop finally came up! A quick check confirmed that all the hard drives were present, except of course the disconnected 1Tb one. The SSD was ready to be installed into the machine!

The hardware part

The downside of SSDs is that they don't seem to fit in any regular desktop case, only in laptops! This is a very frustrating limitation. Why are these drives all 2.5″, or why don't cases have 2.5″ bays? When I shopped for my computer case, only high-end ones had 2.5″ bays, and those came with fancy mechanisms to make the drive removable without plugging any cables, adding yet more potential points of failure. Maybe, at the time I am writing this post, some cases with SSD bays are available, but that doesn't matter; I won't change my case unless I really need to!

Before installing the SSD, I first removed the old 1Tb drive. I just had to remove four screws from my drive cage and slide the drive out.



To install my SSD into my HTPC case, I had a bunch of screws as well as an OCZ bay bracket. Just screwing the drive into the adapter's tray took me forever: I had trouble finding screws that fit, and there was already a screw in one of the SSD's holes, I don't know exactly why, which took me almost five minutes to notice. I then had trouble aligning the screws with the holes, and was getting more and more tired and prone to dropping screws. At least the screw I dropped fell on my table, so I didn't have to search for it on the floor forever.

The following picture shows the drive in the bracket.


Then I had to screw that assembly into the drive cage of my case. Unfortunately, the upper bays of the cage only offer bottom holes, while the SSD adapter has only side holes! I thus had to screw the adapter into one of the bottom bays, which are really meant for hard drives, with their rubber pads to absorb vibration. None of my screws fit well. It seems the SSD adapter has holes smaller than normal, while the screws for the drive bay are larger than usual! I got it in after more than 15 minutes of attempts. I thought many times I would have to postpone this job and wait for my father to come by with a drill to make some new holes in the SSD bracket or the case.

The following picture shows the drive in the cage.


I don't know exactly how much time I spent on this installation, but at the end, I was tired and asking myself whether all this would be worth it.

Well, after the SSD was screwed in and the drive cage was back in my HTPC case, I realized I wouldn't be able to hook up my four SATA drives! No matter what I tried, there was always one drive lacking power, because the SATA cables coming out of my power supply unit were too short to accommodate the drive layout I had come up with. OK, I was at a dead end.

Before giving up and bringing the beast to a computer store in the hope they would figure out a way to hook the four drives up (maybe with some extension cable I don't have, or a new PSU), I remembered that the 1Tb drive I removed had been in the middle upper bay, which was now empty. My only hope of getting the drive powered that day was thus to move one of my hard drives there. OK, so let's remove the cage again and play with the screwdriver once more!

I moved my 3Tb drive from the side bay to the upper one and put the drive cage back into my case. I was then able to hook up power. Reaching the SSD in the side bay to hook up the SATA cables was a bit tricky, but I finally got it. A last check confirmed that all my drives were hooked up, except my blu-ray writer. OK, just one more cable to plug in, and that was it!

Was this all worth it?

After all this hassle, I asked myself this question. When I booted up the machine, it seemed as slow as with the hard drive. What? Maybe the CPU is just too slow after all. But when I reached the desktop and started XBMC, the system felt more responsive.

More importantly, the machine became a lot quieter. For a few weeks, this HTPC had been making a lot of noise. I thought it was the CPU fan stressed out by the Minecraft server running on the system, but the 1Tb hard drive was contributing to the noise as well. I suspect it was emitting more and more heat, causing the temperature to rise inside the case and heating up my poor little CPU. The CPU fan then reacted by spinning like crazy.

Even after I restarted my Minecraft server, the noise didn't come back. I am still surprised by this effect, which I didn't expect.

This 1Tb hard drive is definitely getting old and emitted suspicious sounds a few times. I wonder whether it would have failed and died had I left it in the machine. This SSD move thus saved me an unexpected reinstall and will give me a better time with this HTPC.

So yes, after all, it was worth it!


One SSD instead of two: simpler or not?

My Core i7 machine, named Drake, had two 120Gb SSD drives. I purchased the first one with the machine and put Windows 7 and Ubuntu on it. Then I needed more space for Mac OS X, so I added a second 120Gb SSD. Mac OS X became a pain, almost unusable, because everything was too small. When I reached the point where I had to lower the screen resolution to run Thunderbird comfortably, I got rid of Mac OS X. Then Windows 7, upgraded to Windows 8, started to eat up more space, so I needed to move Ubuntu to the second SSD.

I ended up with a brittle configuration: the ESP (EFI System Partition) on the second SSD, Windows 8.1 on the first drive and Ubuntu on the second. I was waiting for a special deal on a 240Gb SSD and finally got one from TigerDirect at the beginning of September 2014. However, purchasing the SSD is only the easy part. Migrating data from two SSD drives to a single one, with Windows 8.1, Ubuntu 14.04 and UEFI in the way, is an incredible source of headaches. This page shows how I got it done.

The easy way: reinstall everything

That would have worked ten, maybe even five, years ago. Yes, just reinstall Windows, a few drivers, a few programs, put Ubuntu back, adjust some settings, fine-tune a bit, and enjoy the rebirth of the system, coming back to life and full functionality. Things have changed over the years, and not for the better. Now that Microsoft and hardware manufacturers assume people won't install things themselves and will rather purchase hardware with everything preinstalled and preconfigured, setups have become more and more time consuming. Just installing Windows 8 takes more than 45 minutes, and although I could obtain a DVD with Windows 8.1, my Windows 8 serial number won't work with it. I would have had to install Windows 8, then upgrade to Windows 8.1 again!

Then come the drivers. Since I purchased my motherboard before Windows 8 was released, all my motherboard CD has to offer is Windows 7 drivers, so I cannot use the easy auto-install tool performing an unattended setup. I would instead have to download every driver separately from Asus, run them, wait, reboot, run the next one, etc. Then there is the NVIDIA driver, requiring a 100Mb download, yet another installation taking more than five minutes, and yet another reboot. Maybe I chose the wrong motherboard. By sacrificing a few USB ports, S/PDIF audio and maybe some PCI Express slots, I could perhaps get something simpler, requiring fewer drivers and able to make use of what is prepackaged within Windows. That's still to be investigated.

Then come the programs. Yes, Ninite can install many programs for me automatically, but not GNU Emacs or GnuPG, and it won't configure my Dropbox, resync my Firefox bookmarks, or reinitialize my Thunderbird email settings. It won't link my Documents, Images, Music and Videos default folders back to my data hard drive.

And then come the licenses. How will Windows 8.1 activation behave? Will it happen smoothly, or will Windows decide that this change of SSD is too much and require me to call Microsoft to perform activation by phone, forcing me to exchange, by voice, over a poor channel, dozens of nonsensical digits? After Windows 8.1 activation, my DAW, Ableton Live, also requires authorization. I'm not sure it will reauthorize, since I activated it on my main PC as well as on my ultrabook. That means additional hassle.

Bottom line, reinstalling is a pain, and that is just the Windows side. Ubuntu installation is usually smooth, but when a single thing goes bad, it requires hours of Google searches.

This is why I wanted a better way. I was so tired of this tedious process that I was considering giving up on this machine and using my ultrabook instead, if the data transfer failed. But my ultrabook, with its 128Gb SSD, doesn't have enough storage for editing music made of samples or for recording and editing Minecraft videos.

Preliminary connection of the new SSD

Before installing the new 240Gb SSD into my system permanently, I wanted to be sure I would be able to transfer my two operating systems (Windows 8.1 and Ubuntu 14.04) and make them boot. I thus only plugged the disk in rather than attaching it into my case right away. I fortunately had a free SATA power connector as well as an extra SATA data cable and port, which allowed me to connect the new drive without disconnecting the others. This way, it would be easy to roll back if difficulties forced me to reinstall everything, and then think about another strategy or gather my courage and patience for the full reinstall.

I then booted from a USB stick with a live installation of Ubuntu 14.04. This was necessary to perform the data transfer on totally offline, clean file systems.

Before transferring anything to the drive, I ran a SMART self-test. For this, I installed smartmontools with apt-get and ran sudo smartctl -t long /dev/sdb (at that time, /dev/sdb was the new drive's device). The test took almost an hour, but I could leave it running and do something else.
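Once the long test completes, the verdict can be read back with smartctl's self-test log option. This needs root and the real device, so it is only a sketch here; /dev/sdb is whatever device letter the drive received:

```
sudo smartctl -l selftest /dev/sdb   # look for "Completed without error"
```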

The self-test found no defects. I learned to do this preliminary step the hard way when I assembled a machine for my parents: the hard drive failed shortly after, while I was configuring Windows, and I had to RMA it. Performing a self-test first would have saved me some wasted time and frustration.

With the drive free of defects, at least from the point of view of the self-test, I moved to the next step: data transfer.

GParted is the king!

A long time ago, my only friend for partitioning and drive transfers was Partition Magic, from PowerQuest, later acquired by Symantec. That time is over, thanks to GParted, a free open source tool that comes with Ubuntu. This time, though, my job pushed GParted to its limits. Here are the operations I needed to perform with it:

  1. Create a GUID Partition Table (GPT) on the new SSD, because I wanted a pure UEFI-based system. Strictly speaking, GPT is not necessary, since the drive is far below the 2Tb limit!
  2. Copy the first partition of my second SSD at the beginning of the new drive: this is the ESP.
  3. Copy the first partition of the first SSD: this is the 128Mb system reserved partition of Windows. That copy wasn't possible, because GParted didn't know the partition type. I thus left a 128Mb hole declared as unformatted, planning to figure a way out later on. I was hoping Windows could recreate the data on this partition.
  4. Copy the second partition of the first SSD: this was the Windows main partition.
  5. Copy the 40-ish Gb partition of my second SSD to the end of the new drive: this was my Ubuntu home partition.
  6. Copy the 20-ish Gb partition of my second SSD at the bottom of the free space on the new drive: this was my main Ubuntu installation.
  7. Create an extra 20 Gb partition on the new drive in case I would like to give a shot to a new Linux distribution.
  8. Create a 16Gb swap space on the new drive for Ubuntu’s use.
  9. Resize my main Windows partition to take up the rest of the space.

Phew! This long sequence, gathering pieces from different sources, reminds me of infusion crafting in the Thaumcraft mod for Minecraft, where essentia and items are combined on an altar to craft powerful magical objects.

I hoped the sequence would work, but it failed at step 5. For no obvious reason, GParted wasn't able to copy my Ubuntu home partition to the very end of the new SSD! I had to leave an 8Mb gap and then resize the partition to fill it. I then performed the other operations one by one. That was quite a tedious job, because the mouse pointer was too small and impossible to enlarge without a system hack (an Ubuntu bug since 11.10! They chose to remove the option to resize the mouse pointer rather than fix the issue), and sometimes clicking would open a menu and close it right away rather than leaving it open.

The following image shows the final layout. Isn't that great? Not at all sure this is simpler with one drive than with two, after all…


After this transfer process, I tried to recreate the entries for Windows and Ubuntu in my UEFI's NVRAM using efibootmgr. I then unplugged the SATA cables of my two 120Gb SSD drives from my motherboard and rebooted the PC. I won't state the exact commands I used here, because they just failed: the system wasn't booting at all.

Fixing Ubuntu

Back to my Ubuntu live USB, after at least five attempts, because my motherboard is apparently defective and misses the F8 key from time to time, forcing me to jump into Setup and change the boot order from there to boot the UEFI USB stick. Boot time with that Asus board is desperately long. Waiting 15 to 20 seconds from power-up to boot loader is a shame when a 300$ laptop takes less than a second! But the laptop lacks the storage expandability I need, so I am always stuck at one end or the other.

Then comes the fun part. I am pretty surprised there is no easier way to restore GRUB than the following. I read about boot-repair, but it is just not included in the live image; probably yet another PPA to copy, paste and install. Anyway, I ended up getting it to work.

First, I found the partition where Ubuntu was installed, /dev/sda5, and mounted it: sudo mkdir /media/ubuntu && sudo mount -t ext4 /dev/sda5 /media/ubuntu. I did the same with my ESP: sudo mkdir /media/efi && sudo mount -t vfat /dev/sda1 /media/efi.

The second step was to establish the bindings:

sudo mount --rbind /dev /media/ubuntu/dev
sudo mount --rbind /proc /media/ubuntu/proc
sudo mount --rbind /sys /media/ubuntu/sys
sudo mount --rbind /media/efi /media/ubuntu/boot/efi

This made those directories inside my Ubuntu mount mirror the corresponding top-level directories exactly.

Then I had to chroot into my Ubuntu setup, using

sudo chroot /media/ubuntu

After all this, the system behaved much the same as if I had started a shell on my Ubuntu setup. From there, I tried

update-grub

That just updated GRUB's menu entries, not the EFI one, so it didn't fix the boot.

Then I tried

grub-install

If I remember correctly, no arguments were necessary, and that fixed my GRUB EFI setup and added the Ubuntu entry back to NVRAM. This worked only once /boot/efi correctly referred to my ESP. Note, however, that for this to work fully, the live Ubuntu USB had to be booted in UEFI mode, not the default MBR mode.
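From what I understand, the argument-less call works because grub-install picks up sane defaults inside the chroot. The explicit equivalent would be roughly the following; this is my best reconstruction, not the exact command I typed:

```
root@ubuntu:/# grub-install --target=x86_64-efi --efi-directory=/boot/efi --bootloader-id=ubuntu
root@ubuntu:/# update-grub
```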

One reboot later, I was starting my Ubuntu setup, fully intact and working! Half of the transfer done! Well, not quite…

Windows was failing to boot and Ubuntu’s update-grub wasn’t detecting Windows anymore. Quite bad.

Windows: desperately dead

Windows, on the other hand, wasn’t booting at all. It showed a blue screen suggesting I use the repair tools from the Windows DVD. Last time I did this, the tools ran for at least one minute and bailed out, so I had to do a complete refresh, which ended up wiping everything and leaving only applications from the Windows Store. If I have to choose between such a messed-up repair and a clean install, I would pick the second option.

Before entering this reinstall nightmare once again, I tried to recover the reserved partition. I plugged my old Windows 120 GB SSD back in and booted from my live USB stick, to make sure Windows would not start and see two copies of itself (one on the old SSD, one on the new). When Windows sees two copies of itself, it changes the disk ID of one of them; if the new drive is the one changed, everything is messed up and Windows cannot boot anymore until a refresh is done (and then everything is messed up again!). From the live USB, I used dd to transfer the bytes of the old reserved partition to the new one. I also made sure the new /dev/sda2 reserved partition was marked as such in GParted, by modifying the flags. That changed nothing.
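The dd invocation looked something like the following. The device names are hypothetical (on my machine the old disk appeared as a second drive); getting if= and of= backwards destroys the source, so double-check with lsblk before running.

```shell
# Clone the old reserved partition onto the new one.
# /dev/sdb2 (old) and /dev/sda2 (new) are example names -- verify with lsblk!
sudo dd if=/dev/sdb2 of=/dev/sda2 bs=4M conv=fsync
```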

The post How to repair the EFI Bootloader in Windows 8 literally saved me hours of work! It gives a procedure to fix the boot loader. The main idea is to log into the console from the Windows DVD and run the bootrec /fixboot command from the EFI\Microsoft\Boot\ directory of the ESP, followed by bcdboot with a couple of arguments, again from the ESP. Luckily, I had my ultrabook, which was quite handy for checking the page while I was running the commands on my primary PC.

That solved the issue and allowed me to boot into Windows 8.1! PHEW! Quite a nice step forward.

GRUB not detecting Windows

Now that my machine was able to boot into both Windows and Linux, one could wonder what was missing. Well, I had no easy way to choose which operating system to boot at startup. Originally, GRUB was offering me an option to boot into Windows or Ubuntu. After the transfer, it was only seeing Ubuntu.

I found procedures to manually add an entry for Windows, but that involved finding and copy/pasting the drive UUID, and probably redoing the change on each kernel update. I didn’t want that. Another possibility was to install an alternative EFI boot loader like rEFInd, but these have a tendency to display many unwanted icons that do nothing. I had enough trouble with this while fiddling with triple boot (Windows, Linux, Mac OS X).

There was absolutely no way out: people either added Windows manually, or it worked out of the box for them. I had to spend more than 45 minutes inspecting the os-prober script and walking through it! By looking at the script and its logs in /var/log/syslog, I managed to find out it was skipping my ESP because the partition was not flagged as boot! I fixed that from GParted, reran sudo update-grub and tada! GRUB was seeing Windows!
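The same fix can be done from the command line instead of GParted; a sketch, assuming the ESP is partition 1 of /dev/sda (adjust to your layout):

```shell
# Set the boot flag on the ESP (on a GPT disk, parted's "boot" flag marks
# the partition as an EFI System Partition), then rerun the prober.
# /dev/sda and partition number 1 are assumptions -- check with lsblk.
sudo parted /dev/sda set 1 boot on
sudo update-grub
```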

This is NOT the end!

Then I had to proceed with the hardware installation of the new drive. Because I was too impatient when purchasing my Core i7 PC, I ended up with an ill-designed system: had I waited another year, I would have got a superb case with support for SSD drives. Instead I have a CoolerMaster kind of case with only standard 3.5″ drive bays, and need to fiddle with SSD brackets. Screwing the SSD into one of these is a painful process of trial and error, and the assembly doesn’t fit well with the screwless mechanism of the case. It somewhat holds in place, but it’s not the smooth installation a regular 3.5″ drive gets.

Some more fiddling later, my new SSD was plugged back into my PSU and motherboard, and I got rid of the two extra SATA cables. I stored them away; they will be useful sooner rather than later, because my two 120 GB SSDs won’t remain unused.

I plan to put one of them into my HTPC, which will be another adventure of its own. My HTPC has only four SATA ports, all used up, so I will have to get rid of one hard drive.


Bumpy Android upgrade

I recently joined the club of unfortunate Galaxy Nexus owners whose device started down the path of death. Many people told me bad things about these Nexus phones and about other Android smartphones in general. My brother’s device is slow and, for some obscure reason, mixed up its sounds altogether. As an example, the device emits the sound of a photo camera when locked and unlocked! My sister’s phone is slow as hell, putting her through torture each time she opens an application. One of my friends’ phones has no working mic anymore; he has to leave headphones plugged in all the time to answer calls. Another colleague at my workplace had issues with the USB port: the device was not charging anymore.

My problem is sporadic reboots, several times a day, and sometimes boot loops. I thought my phone was agonizing, but I found something that may give it a second life. I will have to see in the long run, but this was nevertheless an interesting adventure.

The symptoms of my Galaxy Nexus

This started a few months ago, on Thursday, March 27, 2014. The phone entered a boot loop and could not do anything other than reboot like crazy. A colleague and friend of mine managed to remove some applications in a hurry, before the next reboot, and that seemed to stabilize the monkey for a few minutes, but it just increased the length of the boot cycles. The device was rebooting like an old agonizing 486 computer overloaded with Windows 98! As a last resort, I tried a factory reset, which helped… until last week. Yes, the device started to reboot again!

I woke up on Thursday, July 24, 2014, and noticed that my phone was stuck on the Google logo. Nothing would unblock it, except removing the battery and putting it back. I did so, rebooted the device, and it got stuck again. Argghhhh!!! I removed the battery once more, left the device and battery on my desk and searched for a solution, to no avail, except that in some cases a bug in Android 4.2 was causing the phone to boot loop, and it would come unstuck after a few attempts. I put the battery back and tried again: this worked. Maybe removing the battery for a few minutes discharged some capacitors and reset the hardware to a cleaner state, maybe I was lucky, maybe both. But the device remained unstable and was prone to reboot, sometimes twice in an hour. The Sunday after, I got fed up and did a factory reset, then didn’t install any applications until I found something longer term to fix the issue. The device then worked without any reboot, so a hardware defect is less likely, although still possible. I need to keep in mind that I dropped the phone a couple of times, including once on my outdoor concrete balcony.

That means at least one installed application was interfering with the OS and causing it to reboot! This is unacceptable in a Linux environment, where each process should be well isolated from the others and from critical system components. A process should not have the ability to reboot the device unless it runs as root, but my device was not rooted, so no installed application could run a root process! That led me to the conclusion that something in the OS itself was flawed, opening an exploit that could be used, intentionally or not, by applications to harm the device!

An average user cannot do much about that, other than refraining from installing any application, factory resetting the phone every now and then, or contacting his phone service provider and getting whatever cheap replacement the provider is kind enough to grant him until the end of his agreement. I didn’t want to hit the same wall as my brother and get something with a smaller display, bloated with branded applications. If I really have to get a new phone, it will be a Nexus free of crapware or, if I cannot get a Nexus, I am more and more ready to take a deep breath, give up on whatever I need to give up, and go for an iPhone.

First upgrade attempt: not so good

However, I had the power and will to do something more about this! It was a bit unfortunate for my spare time, my stress level and maybe my device and warranty, but I felt I had to try. If the OS has a flaw, why can’t I upgrade it to get rid of the flaw and move past this issue? Well, not all Galaxy Nexus phones are equal. US models have the Yakju firmware from Google, but Canadian models have a special firmware from Samsung instead! The Google firmware is the one that gets updated most often, up to Android 4.3. Samsung’s philosophy differs from Google’s: if you want an upgraded Android version, replace your phone.

That led me to the next logical step: can I flash the Yakju firmware on my Canadian Galaxy Nexus? Any phone provider, any reseller, any technical support guy will tell you no, but searches on Google will tell you YES! For example, How to: Flash your Galaxy Nexus Takju or Yakju To Android 4.3 is the guide I started from.

The first thing I had to do was install Google’s Android SDK on my Windows 8.1 PC. Yep, you need the full-blown SDK! The simplest solution is to get the Eclipse+SDK bundle, so at least you don’t have to mess around with the SDK Manager to get the full thing. Then I had to set up my PATH environment variable to include the tools and platform-tools subdirectories, so adb and fastboot would be accessible from the command line. I also had to download the Yakju firmware from Factory images for Nexus devices.
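On Windows this means editing PATH through the System control panel; in shell form (as I later redid on Ubuntu), the setup looks like this. The SDK directory name is an assumption, ~/android-sdk-linux in my case:

```shell
# Add the Android SDK tool directories to PATH so adb and fastboot resolve.
# ~/android-sdk-linux is an assumed unpack location -- adjust to yours.
SDK="$HOME/android-sdk-linux"
export PATH="$PATH:$SDK/tools:$SDK/platform-tools"
case ":$PATH:" in
  *platform-tools*) echo "PATH updated" ;;   # prints "PATH updated"
esac
```

Put the export line in ~/.bashrc to make it permanent.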

The second step is easy to forget when recalling the exact sequence I performed to reach my goal. It is as simple as plugging the phone into a USB port of a computer. That requires a USB cable and, of course, a free USB port. Any port will do, provided it works. If in doubt, test it with a simple USB key.

The next step was to put my device in USB debugging mode. I searched and searched for the developer options, to no avail! Googling around, I found Android 4.2 Developer Mode. Bottom line: I had to go into the phone’s settings, tap About Phone, then tap seven times on the Build Number! This is just shockingly crazy: how was I supposed to find that out? Fortunately, after I unlocked the developer options, I was able to turn on USB debugging. Without USB debugging, ADB cannot communicate with the device.

This was necessary for a simple but crucial step: running adb reboot bootloader. This reboots the device into the boot loader, a kind of minimal OS from which it is possible to flash stuff onto the device’s memory. I read about procedures involving pressing the power and volume up/down buttons, but that never worked for me. This is probably like booting the iPhone into DFU mode, required to jailbreak or to recover from very nasty failures: you have to watch tens of videos, try it fifty times and get it by luck once in a while. These kinds of patience games get on my nerves and make me mad enough to throw the phone away. Fortunately, adb reboot bootloader, with the device plugged into my computer and in USB debugging mode, did the trick.

Once in the bootloader, you can use Fastboot to interact with the minimal OS. Like ADB, Fastboot comes with the Android SDK. However, Fastboot wasn’t working for me: I was stuck at the “Waiting for device” prompt. I started Googling again and found awful things: a driver to download from obscure places and install, which may differ for Samsung devices with respect to other Nexus phones; upsetting stuff about the driver not working on Windows 8 without a complicated tweak to disable driver signature validation; rooting toolkits that could supposedly simplify my life if I installed yet another hundred megabytes of applications onto my PC; etc. Flooded with all of this, I gave up and just let my phone run as is. Getting out of the bootloader is easy: just hit the power button and the phone will reboot as normal.

The Penguin saved the deal!

However, one week later, an idea took shape in my mind and urged to be tested! Linux might have the needed driver built in, so it would be worth trying from my Ubuntu box. That’s what I did on Friday evening, August 1, 2014, and it was a success after a couple of hurdles.

First, I had to install the Android SDK there as well. Once adb and fastboot were accessible, I switched my phone into the bootloader once again, using adb reboot bootloader. Then I tried fastboot devices only to get, again, that stupid “Waiting for device” message. I don’t know exactly how I got to that point, but the command finally output a message about permission denied. Ok, now I know what to do! sudo fastboot devices. Well, no: cannot find fastboot! I had to use the absolute path of fastboot for it to work, but I finally got a device ID. Yeah, the data path between my Ubuntu box and my phone was established!
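An alternative to prefixing everything with sudo, which I didn't try at the time, is a udev rule granting your user access to the device. The vendor ID below is an assumption (18d1 is Google's USB vendor ID, commonly reported for a Galaxy Nexus in fastboot mode); confirm yours with lsusb before creating the rule, since it can differ between normal and bootloader mode.

```shell
# Grant non-root access to the phone over USB. 04e8 (Samsung) or 18d1
# (Google) are the usual suspects -- check the actual ID with `lsusb`.
echo 'SUBSYSTEM=="usb", ATTR{idVendor}=="18d1", MODE="0666"' | \
  sudo tee /etc/udev/rules.d/51-android.rules
sudo udevadm control --reload-rules
```

After replugging the phone, fastboot devices should work without sudo, and the PATH problem with sudo disappears as a bonus.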

Next incantation: sudo fastboot flash bootloader bootloader-maguro-primemd04.img. That gave me a failure, AGAIN! Ok, that’s great, my phone will definitely not accept commands from Fastboot! Maybe it is factory locked to deny them? But before thinking too much, I should have read the error message more carefully and completely. It said the following:

FAILED (remote: Bootloader Locked - Use "fastboot oem unlock" to Unlock)

It even gave the incantation needed to go one step further. I thus ran the command, prefixed with sudo. That popped up a message on the phone’s screen asking me for confirmation. I moved the cursor to Yes with the volume up/down buttons, pressed the power button and voilà, boot loader unlocked!

Why did I have to unlock the boot loader? Probably because I was switching to a different kind of firmware; if I had a US phone, I would probably have been able to install Yakju without unlocking it. The unlock operation is not without consequences: it wipes out all data on the device! This was a minor issue at this stage, since I had refrained from installing anything or doing extensive configuration until I found a way to improve the stability of my device. I thus wiped without asking myself any questions about important data to back up.

Then, feeling like a wizard gathering all the components to cast a spell, I entered the following command and looked at the output.

eric@Drake:/media/data/yakju$ sudo ~/android-sdk-linux/platform-tools/fastboot flash bootloader bootloader-maguro-primemd04.img 
sending 'bootloader' (2308 KB)...
OKAY [  0.258s]
writing 'bootloader'...
OKAY [  0.277s]
finished. total time: 0.535s

Victory! Not really… That was just the first step! The next step was to reboot the device, using sudo fastboot reboot-bootloader. My phone’s screen went black for a couple of seconds, long enough to fear a heart attack, then the boot loader came back again! Phew!

Ok, now the radio: sudo fastboot flash radio radio-maguro-i9250xxlj1.img. That went well, similar to the boot loader. Then I had to reboot again: sudo fastboot reboot-bootloader.

Now the main thing: sudo fastboot -w update with the image archive from the factory bundle. That took almost two minutes, then my device rebooted automatically, this time into the new firmware. Done!
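Put together, the whole flashing sequence was as follows. The bootloader and radio image names come from the unpacked yakju bundle; the update archive name is a placeholder, since the exact file name depends on the bundle version you download.

```shell
# Full yakju flashing sequence, run from the unpacked factory image directory.
sudo fastboot oem unlock                 # wipes ALL data on the device!
sudo fastboot flash bootloader bootloader-maguro-primemd04.img
sudo fastboot reboot-bootloader
sudo fastboot flash radio radio-maguro-i9250xxlj1.img
sudo fastboot reboot-bootloader
sudo fastboot -w update image-yakju-XXXXXX.zip   # placeholder archive name
```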

After these manipulations, I was able to set up my phone normally. Once in the Android main screen, I accessed the phone settings and confirmed I was now on Android 4.3! At least I reached my goal.

What can I do next?

There are a couple of things I will try if the device starts rebooting again. Here they are.

  1. Install a custom ROM providing Android 4.4. Besides upgrading to the latest Android, this will give me extended battery life, since 4.4 greatly improved on this front, as I experienced with my tablet, which recently benefited from a custom 4.4 ROM. I will also be able to return to the baseline Yakju 4.3 if needed. Unfortunately, I had no way to back up my 4.2 firmware, so I cannot go back to that.
  2. Shop for a new phone. I will try to get a Nexus 5 and, if I cannot without switching providers, I will shop for an iPhone. Maybe I will find a store in Montreal selling unlocked phones, including Nexus; maybe I will have to wait patiently for my next trip to the United States to buy an unlocked Nexus 5 there; maybe I will be able to convince someone in a US office of my company to buy the phone and ship it to me (if I ship him a check for the amount of the device, obviously!); maybe I will find something to make me happy on a web site I don’t know about yet. We’ll see.
  3. If all else fails, I will give up on installing any applications and use the Galaxy Nexus just as a phone and for casual Internet access with the stock browser. After my agreement with Fido ends next November, I will consider other, hopefully better, options.



Issues with NoMachine’s NX Client

I recently tried using NoMachine‘s NX Client to connect to a virtual machine at work running NX Server, and ran into an incredible number of problems. In the end, I gave up on NX and fell back to VNC, but the exploration is nevertheless interesting.

The virtual machine at my workplace is running NX Server 3.4.0-12, probably the free edition. I have no control over this. However, I can control which NX client I run.

I got issues with screen resolution on Windows 8 and erratic keyboard responses, and ended up switching to VNC after I couldn’t find any solution under Windows 8.

Awful screen resolution on Windows 8

When I connected to the NX server using NX Client 3 or 4 from my main corporate laptop running Windows 7, I got no display problem. The desktop showed up at a nice 1674×954 resolution, near the native 1680×1050 I get on the 22″ LCD out there. I can bump the resolution closer to native by switching my NX client to full screen.

However, I have a secondary ultrabook running Windows 8.1. Because it is lightweight, I like to use it when working from home. Running the NX client on this machine causes a major issue: the display resolution goes down to 1114×634! I searched for a solution, or at least a workaround, to no avail. Nobody seems to be having this issue, and there is a very good reason why.

Because I am visually impaired, I need larger fonts and a larger mouse pointer. There is a very neat way to get this under Windows since version 7: DPI scaling. It can be adjusted by right-clicking on the desktop, accessing Personalize and clicking on Display. I usually bump the scaling up to 150%, which makes fonts large enough for most cases and also enlarges the mouse pointer. This doesn’t completely remove the need for application-specific tweaking, but it helps greatly. Magnification is done without lowering the screen resolution, by applying a scaling before rendering the graphical elements, as opposed to scaling bitmaps after the fact as the Windows built-in and third-party zooming applications do.

Under Windows 7, this functionality doesn’t affect the NX client at all: it gets the real display resolution and can make use of all pixels. Under Windows 8, it seems to get some form of virtual resolution sensitive to the scaling! Let’s do the math real quick. If we divide 1680 by 1.5, we get 1120. Dividing 1050 by 1.5 yields 700. Isn’t that near 1114×634?
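A quick sanity check of that hypothesis, in shell arithmetic (dividing by 1.5 is the same as multiplying by 100 and dividing by 150, which keeps it in integers):

```shell
# Native 1680x1050 scaled down by the 150% DPI factor.
echo "$((1680 * 100 / 150))x$((1050 * 100 / 150))"   # prints 1120x700
```

The small remaining difference (1120×700 versus 1114×634) is presumably window decorations and task bar being subtracted before scaling.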

This is the third application misbehaving with DPI scaling for me. The first was Ableton’s Live virtual studio; the second was Corel’s VideoStudio, a video editor. In both cases, I was able to turn off DPI scaling locally for the offending application. This is easy for 32-bit applications: right-click on the executable, access the properties, then the Compatibility tab, where there is a check box for this. For 64-bit applications, it is trickier and would deserve its own post, since it involves registry hacking. But it is doable, because this is how I worked it out for Ableton Live. I have a post in French about this, called Le problème des caractères trop petits sous Windows.

However, this failed miserably for the NX Client. I tried applying the compatibility setting to each and every executable found in the installation directory, including NXWin.exe, which is responsible for showing the desktop, NXSSH.exe, which establishes the SSH connection, NXClient.exe, the frontend offering the configuration panel, etc. I tried upgrading from NX Client 3 (recommended by my company since NX Server 3 is in use) to NX Client 4, with absolutely no change.

There is only ONE workaround I could find: completely disabling DPI scaling session-wide, decreasing it from 150% down to the regular 100%. However, I just cannot work efficiently this way. Although I can bump up font sizes individually for elements, the mouse pointer remains desperately small, even when I use the magnified pointer scheme. I couldn’t find any convincing solution, although I got a couple of proposed ideas that I will list here for the sake of completeness.

  1. Start a Remote Desktop Connection from my ultrabook to my corporate laptop, or even from my home PC (which would free me completely from transporting a laptop between office and home). This is an attractive solution used by many colleagues. I tried it, and it unfortunately failed because Virtuawin didn’t work at all on the remote desktop. Well, it worked, but with no keyboard shortcuts to switch between desktops! Some colleagues got past this by changing Virtuawin’s keys. However, if I start the remote desktop connection while no session is active on the laptop, font sizes and the mouse pointer are small, as if there were no DPI scaling. I couldn’t find any solution for this second issue.
  2. Run the NX client from my home computer instead of my company’s ultrabook. This machine is a dual-boot system with Ubuntu 14.04 and Windows 8.1. I could run NX Client from Ubuntu (version 4, because NX Client 3 for Linux is not available for free anymore on NoMachine’s web site, and my company offers NX Client 3 binaries only for Windows and Mac), and I managed to set up the VPN connection, so that is doable. However, I heard that the VPN connection is unstable under Linux. Yes, I could work around that by establishing an SSH tunnel, letting my Windows ultrabook manage the VPN and Linux manage NX. But connecting to Microsoft’s Lync would cause difficulty, especially for voice chat. I would also have no option for Outlook other than using the web mail, which lacks keyboard shortcuts, or switching back and forth like crazy between the ultrabook and the Linux box! Even with an ideal dual-monitor setup, with one screen for the ultrabook and a second for the home PC, how would I copy/paste content between the two? I don’t know yet.
  3. How about running the NX client on the Windows side of my personal PC? Yes, it would be possible. I could even go as far and crazy as purchasing a license of Microsoft Office to get Outlook set up, and Lync could work like a charm; some colleagues got it set up. But my PC runs Windows 8.1, so I’m back at square one!
  4. Downgrade to Windows 7, or purchase a separate cheap PC with Windows 7 and install a setup on it. Well, that’s possible, but I am still left with no good solution for Outlook, unless I purchase a license that will be locked to that cheap PC. If I have to purchase Office, I would like to use it on my main computer, to maximize my investment!
  5. Wipe the ultrabook and install Linux on it. Well, the machine is intended to be used for testing a Windows application developed by my company, so I cannot just trash Windows and toss Linux in blindly. I would also be in a situation similar to my personal PC running Linux: partial Lync with no or brittle voice chat, and no convenient access to my Outlook mail.
  6. Install VirtualBox on the ultrabook and set up Linux in a virtual machine. I tried it; it almost worked on my personal computer, but it failed when I transferred the VirtualBox setup to my ultrabook. Symantec Endpoint Protection, used by my company, screws up VirtualBox’s executable, making it complain that it cannot run. I tried adding exceptions covering every executable in VirtualBox’s directory, to no avail. It seems this requires changes to the profile policy using a management application I don’t have. Since VirtualBox was also causing random erratic behaviors, like Alt Gr sometimes not working, Alt-Tab sometimes switching out of the VM, etc., I gave up on this. If I had to continue exploring this path, I would try exporting SEP’s XML profile, altering it and reimporting it, similar to what I did to get rid of the password protection that was preventing me from uninstalling and reinstalling after the upgrade to Windows 8.1 broke it.
  7. Use something other than NX. I tried to find an open source NX client that would have fewer problems: no go. I really have to give up on NX completely. There are two main alternatives to NX: VNC, or plain X using a server such as Cygwin/X or Xming. Because of other problems I will cover below, I finally fell back to VNC, which works better although it is a bit sluggish.

Update, August 5, 2014: yesterday, I tried again, and my company’s Windows 8 ultrabook got connected using NX Client 3 with almost native screen resolution. Putting the client in full screen bumped the resolution to the native 1680×1050! Probably some NX processes had been kept running in the background, and a reboot was necessary for the disabling of DPI scaling to take effect. On the evening of August 5, I tested the NX client on my personal Windows 8 computer. I got the previous low resolution. I then disabled DPI scaling for the following executables in C:\Program Files (x86)\NX Client for Windows and its bin subdirectory: nxclient.exe, nxauth.exe, nxesd.exe, nxfind.exe, nxkill.exe, nxservice.exe, nxssh.exe, and NXWin.exe. I’m not sure all of these need to change, but I did them all. A reboot later, I was getting the native resolution on my personal computer as well. So the fix is reproducible!

Two versions, two protocols

My company is using NX 3 while the newest version is 4. In theory, this shouldn’t be a problem, as it should be possible to download previous versions or, even better, use the newest client with a previous server. In practice, this is not exactly the case. First, only NX 4 can be obtained for free from NoMachine; getting previous versions requires being a registered customer. My company provides binaries for NX 3, but only for Windows and Mac OS X. This caused me additional difficulties when testing things on Ubuntu.

By default, NX client 4 won’t work with NX server 3, but it is perfectly possible to configure it to work, as follows.

When starting the client for the first time, a screen similar to the one displayed below shows up and needs to be dismissed by clicking on the Continue button.


That leads to a second window similar to what follows. That one needs to be dismissed as well using the Continue button.


Then you end up at the main screen where connections can be added, removed or edited. This looks like the image below.


Click on the New button to create a new connection. This pops up a window similar to below.


The first important setting is the protocol; it needs to be SSH, not NX. This is probably because NX 3 was working on top of SSH while NX 4 has its own TCP/IP protocol. Anyway, selecting SSH is necessary for the connection to work. After that, click Continue to go to the next step: a screen allowing to enter the host name of the NX server.


On the next screen (after clicking Continue), you need to perform one additional setting: set the connection type to “Use the NoMachine login”. The system login seems to work only with NX 4.


Leave the next two screens as they are.



Then you have the opportunity to name the connection. This is just for convenience; this doesn’t affect the ability to connect at all.


After all these steps, you end up at the central menu and can double-click on the newly created connection. You might get a screen similar to the following one. Click Yes to accept the authority of the NX server.


After this you have to enter your regular user name and password.


Then you have to double-click on the virtual desktop you want to create.


Three more screens to dismiss…




At this point, I got some trouble connecting when I tried to upgrade from NX Client 3 to 4 under Windows. The Ubuntu setup, on the other hand, worked very well. Looking at the logs, I found errors about the cache and had to remove files from a hidden directory I couldn’t access from Explorer without copy/pasting its name from the log file! The directory wouldn’t show up, even though Explorer is configured to show hidden files for me, and it would not Tab-complete under GNU Emacs.

Almost there, you would say, a bit annoyed. Well, no, that’s not the end of the story, as you can see in the image below!


How am I supposed to work with such a small screen? Maybe some sighted people can, by making fonts tiny, but that’s not acceptable for me. Moreover, Alt-Tab doesn’t work: it switches out of the NX window rather than through windows inside the NX desktop.

Fortunately, there are ways to configure things better. First, hit CTRL-ALT-0 (zero, not o). That leads to a new menu with options not available through the connection preferences.


First click on Input. That leads to a window from which you can check Grab the keyboard input. This makes Alt-Tab work inside NX, with unfortunately significant drawbacks covered in the next section. Dismiss by clicking Done.


Then click the Display button. Select Resize remote screen, dismiss with Done.


Dismiss the main window with Done and tada!


Yes, a fully functional GNOME desktop running inside the NX client. Phew! What a ride!

And that’s not the end…

What’s the point of having a keyboard if it doesn’t work?

Well, I asked myself this question countless times while working with NX. It is so erratic, so frustrating, that it got on my nerves at times. I strongly rely on keyboard shortcuts for my daily work. Without them, I am completely inefficient, spending a significant amount of time and wasting energy searching for my mouse pointer. Until touch screens spread and become available in 22″ and bigger formats, I will be stuck dealing with the mouse and working around it with keyboard shortcuts as much as I can.

Here are the keyboard-related issues I ran into under the NX client, versions 3 and 4 alike.

  1. CTRL-ALT-<arrow keys> doesn’t switch desktops when NX runs under Windows with Virtuawin, a software tool adding the multiple desktops Windows has lacked for years. When I used the keys, Virtuawin took over and got me out of NX, so I could not use multiple desktops inside NX. To enable this, I had to remap GNOME’s keys inside NX; I could as well have remapped Virtuawin’s keys.
  2. After approximately one week of usage, the NX client started to go crazy with the Alt Gr (right Alt) key. I rely on this key to produce many special characters like @, [, ], {, }, etc., because I am using a Canadian French keyboard. Sometimes, the Alt Gr combination just did nothing, so I had to type it many times (sometimes more than ten) until the character popped up. Sometimes, the session got stuck with no keyboard functionality for at least thirty seconds (the mouse continued to work). With NX 4, things got worse: no more Alt Gr at all! Running xev, I found that the right Alt key was generating Left Control events instead. The workaround? Well, disable Grab keyboard input! But that makes Alt-Tab non-functional, and Alt-Tab is one of my most critical keyboard shortcuts! Without it, I have no efficient way to switch between windows. So I remapped the key in GNOME to Ctrl-Tab. This was a bit annoying but somewhat working. The problem got worse on August 5, 2014, maybe because the virtual machine was running a script performing I/O that competed with the bandwidth available to NX. It is thus possible that networking issues drop some events while the client and server communicate with each other.
  3. With NX 4, sometimes the keyboard stops working altogether and the session locks itself. It is impossible to type any password and very hard to kill the session. I managed to do so by forcibly terminating the NXWin process. Another way is to temporarily turn on Grab keyboard input, type a few characters, then turn it off! This happens at random times, when switching from a Windows application like Outlook or Lync back to NX client. This issue didn’t happen on NX client 3.
  4. On both client versions 3 and 4, the Shift key sometimes sticks: the system behaves as if I were pressing and holding Shift while I am not. There is no way out except pressing Shift repeatedly until it unblocks; sometimes that requires disconnecting and reconnecting.

So it seems NX is designed for the simplest scenario: a US English keyboard, with the mouse used to switch tasks.

The only workaround I found is pretty heavyweight: install VirtualBox, set up an Ubuntu virtual machine and run the NX client from there. It seems that Windows is the one intercepting some keys and not sending them to NX. The culprit could literally be any of many applications: Outlook, Lync, Symantec Endpoint Protection, etc. It would be tedious to find it and probably impossible to disable it.

Copy/paste: a random game

Even the simple and common operation of copy/pasting causes trouble when NX is involved. There are two main types of problems. The first, which I found to happen in both NX 3 and NX 4, is the instability of the clipboard transfer. Sometimes, I copy a block of text into the clipboard and when I try to paste it in another application, nothing happens. It mainly happened when copying from NX and pasting to a Windows application, say Outlook or Lync. I have reached the point of systematically performing the copy operation twice in a row to give it a greater chance of succeeding!

Sometimes, one or two seconds after I select an area to copy, it automatically deselects. If I try to select the area a second time, it deselects again. When that happens, I have to select and very quickly right-click on the area to reach the Copy command in the contextual menu. This second issue is intermittent and quite annoying. It seems to happen only on CentOS 6.3 virtual machines running GNOME Terminal. I tried to connect to an Ubuntu machine using NX and didn’t get the issue. The problem also didn’t happen when I ran the NX client inside VirtualBox, so it might be caused by Windows or some other application.


After the second time NX Client 4 went south with the keyboard, leaving me locked out of my session, I got tired and tried something else: VNC. It turned out to be simpler than I thought. I just had to SSH into my virtual machine and type vncserver there. I had to set a password for my VNC connection, and vncserver told me the host name and display to use in the VNC viewer.
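The whole setup boils down to a couple of commands. The host and user names below are made up for illustration, and the display number depends on how many sessions are already running:

```shell
# From the Windows machine, open an SSH session to the virtual machine
ssh myuser@myvm.example.com

# On the VM, start a VNC server; the first run prompts for a session password
vncserver

# The server prints which display it created, for example:
#   New 'myvm.example.com:1 (myuser)' desktop is myvm.example.com:1
# Point the VNC viewer at myvm.example.com:1 (display 1, i.e. TCP port 5901).
```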

I first tried UltraVNC, because that viewer supports encrypted connections between client and server. The connection worked like a charm, but keyboard support is quite poor. First, right Alt, again, failed. It seemed more and more that I would have to switch to a US English keyboard to work with that virtual machine. Then I noticed the viewer was randomly skipping keys. For example, I typed exit and got just the e! So on each key press, I would have to stop, check the screen to see whether the character appeared, and retype it as many times as needed. I am not used to working like this: I rely on key presses registering. Being able to trust the keyboard is what keeps me from getting completely drained!

After a very frustrating and almost catastrophic failure installing VirtualBox on my company’s ultrabook (the beast didn’t like Symantec Endpoint Protection and messed up my Internet connection), I tried a second VNC client: TightVNC. It does not encrypt the traffic, but that’s not a big issue since the virtual machine is inside the company’s network, accessed through an encrypted VPN. That one worked relatively well, with a couple of tweaks and two drawbacks.

Here are the tweaks:

  1. TightVNC misbehaves under Windows 8 in a way similar to NX Client. Fortunately for me, this one can be worked around by disabling DPI scaling just for TightVNC.
  2. Under GNOME, I had to use XRandR to switch resolution, by typing xrandr -s 1680x1050 in a terminal.
  3. I had to switch VNC to full screen using CTRL-ALT-SHIFT-F, otherwise some portions of the screen were cut off.
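The resolution tweak is worth a quick sketch, since the size must match the local screen (1680x1050 here is my screen, not a universal value):

```shell
# List the modes the VNC session's virtual screen supports
xrandr

# Switch to the resolution matching the local monitor so the
# full-screen viewer maps pixels one-to-one
xrandr -s 1680x1050
```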

But in full screen, TightVNC completely captures ALT-TAB and CTRL-ALT-<arrow keys>! I have to leave full screen mode to give the keys back to Windows. The right Alt key also works great. This is very nice, as if I were working under Linux! However,

  1. Performance is not as great as with NX. The display is a bit sluggish, although manageable, especially given the incredible benefit of a working Alt-Tab. However, sometimes, especially when I am working from home, the display refresh becomes very slow and a typed character appears a second later. In one case, it was so slow that I had to figure out a way to work locally for the rest of the day.
  2. Clipboard support is a bit clunky. I managed to transfer data from VNC to Windows by starting vncconfig -nowin inside my VNC session, but that doesn’t solve the other direction: I cannot transfer data from Windows applications to the VNC-managed GNOME session. I couldn’t find any solution for this.
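The half-working clipboard bridge is just vncconfig left running in the background of the session; this is a sketch of how I invoke it, and on some distributions the tool ships in a separate package:

```shell
# Inside the VNC-managed GNOME session: run the clipboard helper
# without its control window and keep it in the background.
vncconfig -nowin &

# With this running, text copied inside the VNC session becomes
# available to Windows applications. The Windows-to-VNC direction
# still does not work for me.
```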

If everything else fails

If VNC fails too in the end, little remains other than establishing a traditional SSH connection and working from the terminal. I will need to open a new window and SSH again each time I want a new terminal. I tried Screen to get multiple terminals in the same window. That works relatively well, except under Emacs, because the Ctrl-a key used by the editor conflicts with Screen’s command prefix.
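The Ctrl-a conflict can at least be worked around by moving Screen’s prefix to another key. A minimal ~/.screenrc doing that (using the backtick, one common choice; any rarely typed key works):

```shell
# ~/.screenrc -- move Screen's command prefix off Ctrl-a so Emacs
# keeps its beginning-of-line binding.
# First character: the new prefix; second: the key that produces a
# literal prefix character. With this line, press ` twice to type a `.
escape ``
```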

Moreover, Emacs somewhat misbehaves in an SSH terminal, at least with Cygwin. Selecting text and typing a key doesn’t overwrite the selection; it just inserts characters next to it, as if nothing were selected. Moreover, Ctrl-Backspace erases a full line rather than the last word.

If I need a graphical display, for any reason, I can start an X server such as Cygwin/X or Xming, run export DISPLAY=:0 and use the -X option when starting SSH. With this trick, any X client shows its contents on the Windows screen. This is pretty slow over a VPN connection, but at least it is a known-good solution that will almost always work!
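The X-forwarding fallback, sketched end to end; this assumes Cygwin’s ssh on the Windows side and an X server such as Xming already listening on local display 0, and the host name is made up:

```shell
# On the Windows side (Cygwin terminal): tell SSH's forwarding
# machinery where the local X server listens
export DISPLAY=:0

# -X enables X11 forwarding; on the remote end, DISPLAY is set
# automatically so any X client renders on the Windows screen
ssh -X myuser@myvm.example.com

# On the remote machine: the window appears locally
xterm &
```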

Can we make things better?

If the server side evolves, yes. I can think of at least two ways things could improve.

  1. The X protocol is quite old and clunky. New protocols in development could replace it and be more efficient and compact over a VPN. The main one is called Wayland, a completely documented and open protocol. Canonical, the maintainer of the Ubuntu distribution, is also developing its own alternative called Mir. From what I have read, Mir is simpler than Wayland, but it is more closed; only the API is open while the protocol is under Canonical’s exclusive control. Using either Wayland or Mir may result in less traffic, and thus more efficient graphical sessions over the same network bandwidth.
  2. Logging in to a centralized server and working from there is the ’70s way! Nowadays, every laptop has tremendous CPU power that sits shockingly unused when logged in to a remote desktop. What we need instead is a way to share filesystems, and there are numerous protocols for this: SSHFS, WebDAV, CIFS, etc. One argument against this is data security, which would need to be addressed by encrypting the hard drives of the laptops mounting the file systems. Moreover, some work may require Linux, either on bare metal (a dual-boot laptop) or in virtual machines. The company could provide a prefabricated OS image that employees would install and be free to alter as they see fit, to install new applications or change settings.
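As an example of the file-sharing approach, mounting a remote home directory with SSHFS takes a single command; the paths, host and user below are illustrative:

```shell
# Mount the remote home directory at a local mount point
mkdir -p ~/work
sshfs myuser@fileserver.example.com:/home/myuser ~/work

# Edit and build the files with local CPU power, then unmount when done
fusermount -u ~/work
```

Everything travels over SSH, so this also sidesteps the encryption question that plain CIFS raises.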