Spurious mail delivery errors

A few weeks ago, I started to receive emails containing error messages about the delivery of mails I never sent. The contents looked like spam, but why weren’t they caught by GMail’s anti-spam filters? Maybe spammers had found a new way to send their junk that circumvents current filters. A few weeks later, the annoyance persisted: I was receiving at least one of these emails per day, sometimes several. I started to suspect someone had hacked into my GMail account and was using it to send spam, but I couldn’t find any trace of these messages in my “Sent” folder. Maybe they could circumvent that as well. Would I have to change my password just in case? And what guarantee would I have that they wouldn’t hack in again?

Friday, April 14th, I got fed up with this. First, did these come from the same sender or group of senders? If so, I could block those addresses. Otherwise, there was a problem with GMail that would need to be solved eventually; otherwise I would have to switch from GMail to some other email service. Looking at the sender’s address, I found out the messages were coming from something @ericbuist.com. Could it be that the mail account on my Web host was misconfigured?

I logged onto my HostPapa cPanel and reached the mail options. I found out that anything @ericbuist.com not corresponding to a valid email account was sent to a default email address. As a result, spammers in need of a fake origin address can use anything @ericbuist.com in the hope it won’t correspond to a valid account. I thus reconfigured the default route to return an error instead of redirecting the message. I also found out that besides forwarding traffic to my GMail account, the HostPapa mail service keeps a copy of the messages. I thus had 250 MB of junk emails there, which I deleted to free space. Although disk space is unlimited on my HostPapa plan, if every customer abuses it by leaving junk on their account, HostPapa will have to impose quotas at some point.

I didn’t receive any other emails about mail delivery failures after that. Unfortunately, this is not the only cause of such problems. Other people had issues because they forwarded all their GMail messages to a service sending SMS, and the service went down; they had to disable that forwarding from their GMail accounts. Things get worse when other email addresses are redirected to a central account. All of these can cause spurious emails and thus need to be checked in case of issues.

Bumpy Ableton Live session

Yesterday, I tried upgrading to the latest version of Ableton Live, 9.7.1. Everything went well, but I ran into other issues, not related to Live, that made my work session quite bad and frustrating.

S/PDIF not working great

A month ago, I got a new audio interface: the Focusrite Scarlett 18i20. This amazing device provides eight analog audio inputs and ten outputs. This is far from the advertised 18 inputs and 20 outputs, but those counts include S/PDIF and an add-on card that plugs into the optical ports of the interface. Anyway, eight inputs is more than enough for my needs. I have difficulty playing one instrument reliably, so I won’t start playing multiple instruments at the same time, at least not now!

I didn’t have enough long audio jack cables to plug in my Novation Ultranova (two channels), my Korg EMX (two channels) and my Nord Drum (one channel), so I decided to try hooking up my Ultranova through S/PDIF instead. For this, I used an RCA cable I had gotten somewhere I don’t remember. I plugged the coaxial S/PDIF output of the synthesizer into the corresponding input of the audio interface, then fiddled with MixControl to figure out HOW to enable S/PDIF. Easy, I thought: just set up one entry in Mix 1 to route S/PDIF L to the left channel and S/PDIF R to the right channel. The Mix 1 mix was already routed to the two monitor outputs of the interface. With that, I should have gotten sound from my Ultranova into my audio monitors. No, nothing! I verified that the S/PDIF output was enabled on my Ultranova: it was.

I tried, checked many times, searched the Web, and found I should set the sync source to S/PDIF instead of Internal in MixControl. Did it, no result. I spent at least half an hour trying, checking and trying again, only to find that the volume of my Ultranova was turned all the way down. Turning up the volume solved it!

BUT I started to hear crackling sounds from time to time, especially when playing long notes with pad-style sounds. That means S/PDIF doesn’t work well out of my Ultranova or into my audio interface, or that it requires a special cable I don’t have. But then WHY does the S/PDIF connector have the exact same shape as an RCA connector? As it turns out, coaxial S/PDIF does use RCA connectors, but it expects a 75-ohm cable; an ordinary RCA audio cable can cause exactly this kind of intermittent corruption.

There is no solution for the moment, except using the analog jacks and not being able to plug in my EMX, Ultranova and Drum at the same time.

Jumpy mouse

While trying to work with Ableton Live and MixControl, I had to cope with too-small fonts all the time. I ended up using the Windows zoom (Windows key plus +). But the zoom regularly jumped all around. I figured out that the mouse pointer was moving around for no obvious reason. Ah, this is why I am now constantly losing the pointer, forced to bring it back to the upper left corner of the screen almost each time I want to click on something! The pointer really is jumping around, I’m not going crazy! This made working with the mouse a real pain, similar to what I experienced with the old Mac my brother’s wife gave me a year ago. I thought about running Live on that Mac, because many people claim that Macs are more stable for music production, but the machine is way, way too slow for that, so I forgot about it and never tried!

I ended up trying another mouse, which seemed a bit better, but I realized that its right button was completely non-working! Why the hell did I keep this stupid mouse then? I threw it in the trash can and put back the first one. Then I figured out that putting the mouse on a piece of white paper helped, making it a lot less jumpy.

Windows update restarting computer while I’m using it

Windows 10 sometimes automatically restarts the computer to apply updates. Up to now, this only happened while the machine was idle. Well, yesterday it happened right in my face, while I was working with Live! I got so pissed off by this that I tried to disable this really bad functionality. I fortunately figured out a way to disable these forced updates. It was relatively easy, although it caused me trouble because my Windows is in French and the procedure was in English. If this procedure doesn’t work and spurious reboots happen too often, this may force me to downgrade to Windows 8 or Windows 7, or switch to Mac and have constant trouble with too-small fonts. This could be a dead end leading me to stop using my computer, or at least stop trying to make music with it.

Slower and slower machine

My main computer is on a desk while my music gear is on a table against the opposite wall. I tried to link them together using a long USB cable and a hub, but that failed with crashes from Ableton Live. However, my attempts were with the audio interface built into my Ultranova. Maybe I’ll have more luck with my Focusrite, if the cable and hub are stable enough. Why a hub? Well, it lets me have a keyboard and mouse next to my music table. I will also transport video through an HDMI cable and get a screen nearby as well.

But for now, I ended up having to use my Lenovo IdeaPad Yoga 13 ultrabook for my attempts at music production. This worked relatively well, but the machine has been getting slow since I updated it to Windows 10. Searching forums gives no results, except that other people are experiencing performance problems too, sometimes on Windows 10, sometimes on Windows 8.1. Starting Live now takes almost 45 seconds on this machine. Fortunately, the program responds correctly for now, until of course I add enough tracks and effects to my Live set to make it choke up like crazy. I guess this will happen if I go far enough in music production.

Difficulties with music production itself

Creating the track I had in mind caused me great trouble. While not super complex, it is not a trivial repeating drum beat. I managed to play it a couple of times, started recording on my EMX and messed it up completely. I tried again, messed it up again. I cannot play it reliably unless I try 25 times or more. The workaround is to correct notes afterwards, but this is quite tedious on the EMX. Tired of this, I tried to record MIDI using my Ultranova as a source and Live as a sequencer. But even in Live, fixing the incorrect notes was a real pain. I experimented with quantization, which also didn’t work correctly.

There are no well-defined workflows and no comprehensive tutorials about music production. All I can find is case-specific pro tips, sometimes involving plugins I don’t want to install yet. I’m already overwhelmed with Live itself, having to constantly check and redo what I am doing; this is not a great time to complicate things with plugins.


Although I am having less and less fun with all this for the moment, I feel I can manage to get something good out of it. If I gave up because of difficulties, I would not have been able to get a Ph.D., keep my job for more than seven years or create a modded Minecraft map.

Shocking problem with audio channels

A couple of months ago, I bought myself a condenser microphone to improve the quality of the recordings in my Minecraft videos. However, such microphones require an XLR connection supplying phantom power. An audio interface or mixer is required to power such a microphone and get the captured audio out of it. My first setup was a bit convoluted and required two cables going from my computer desk to the table on which I installed my music production gear:

  1. My microphone is on my computer desk and linked to my mixer with a cable running on the floor.
  2. My mixer is sending phantom power and getting the microphone’s audio. It gets a mono signal and spreads it to its two output channels. From the mixer, it is possible to adjust the microphone’s volume as well as its position in the stereo image.
  3. My mixer sends its output, including the microphone and other sound devices, to my Novation Ultranova.
  4. My Ultranova is linked to my computer through a USB cable which also has to run on the floor.
  5. The audio interface built into my Ultranova is used to turn analog sound coming from my mixer into digital audio.

After some changes in my home office, I had to move the table with the music gear further from my computer desk, which prevented me from using this setup until I get longer cables. I may instead end up with a second computer dedicated to music production, which would make controlling Ableton Live easier than going back and forth between my music table and computer desk. I then needed a new solution for my microphone setup.

Luckily, I have an M-Audio FastTrack Pro interface I decided to give a new shot. The interface had issues with Ableton Live, making the software crash and misbehave intermittently. The issue could come from the interface itself, the ASIO driver, Windows 10, Ableton Live or something else. There was no way to track it down, which is why I switched to using my Ultranova as my audio interface. But maybe, I thought, the M-Audio FastTrack Pro would just work for this simpler application.

I thus put it on my computer desk, plugged it in through USB, plugged my microphone into the first input and turned it on. I made sure the first input was configured in Instrument mode, turned on phantom power and then performed a test. I had a voice-over to record at the end of an in-progress Minecraft video. For this, I usually use Corel VideoStudio X8.

However, when I listened to the recording, the sound was correct, but it was coming from the left channel only. It didn’t take me long to realize Corel VideoStudio was accessing my audio interface as a stereo device. The interface was then simply and predictably providing stereo information: the left channel coming from the first input, the right channel from the second input. Nothing plugged into the second input? No problem, the interface just provided silence. This is simple and logical, but today’s software expects more magical behavior: VideoStudio was assuming the interface would duplicate the single input across both channels! Apparently, some low-end USB microphones just do that! I also realized that my recording software would react the same way; my voice would play on the left side only.

Searches on Google only gave me unacceptably complicated solutions.

  • Post-process the audio in another tool like CoolEdit Pro, Audacity or Sound Forge to turn the stereo file into a mono one. That would have forced me to figure out the name VideoStudio gave to my voice-over, maybe even export the clip manually from VideoStudio to a WAVE file, find the file in the audio editor of my choice, search forever to figure out how to make the file mono, save the file back somewhere, return to VideoStudio, find the file there and import it. If I had to do this one time, I would do it and that’s it. But I would have to repeat all that for every voice-over I make with this new setup!
  • Encode the video with mono audio. Besides requiring a lot of tedious manipulations in VideoStudio (click there, find that option, click there, there, there, there, etc.), this is unacceptable as my game sound is stereo and I want to keep this.
  • Insert a Y splitter cable linking my microphone to both inputs of my audio interface. That could work in an RCA or jack world, but I’m not sure at all about the results with XLR plugs delivering phantom power! Of course, nobody has accurate information about that. According to my distant memory of the electricity I learned in physics, both XLR inputs would deliver 48V, resulting in a circuit with two parallel paths delivering 48V, so the output of the Y would get 48V, not 96V; but maybe I was wrong, and that would just blow up my microphone. I would also have to order this Y splitter cable on eBay or from Addison Électronique and wait several days for it, or go to the Addison store, which involves a never-ending bus trip for me.
  • Some forum posts were suggesting that the software tool is responsible for correctly configuring the audio interface, and if it doesn’t, I have to switch to something else. That would mean using one tool to capture video, a second tool to capture audio, and managing to sync them up in some way or another. That means having the two tools side by side and rapidly clicking the record buttons, hoping they start simultaneously. That’s stupid, crazy and inefficient, and I really hate that people propose, adopt and accept such solutions because they’re not so bad for them. This is bad, because computers are all about automation and should not force human beings to repeat stupid, brain-killing tasks!
  • According to my research, some USB microphones deliver a stereo signal to Windows, which would just avoid this issue. I could thus switch to such a microphone, forgetting about my current device. I really disliked that, because I didn’t want to replace an already-working microphone with a potentially inferior one. And what would happen to my current microphone? Well, maybe my brother would make use of it in his jamming room. Quite a small consolation…
  • Maybe another audio interface would handle this issue better. I could for example try the FastTrack Solo interface, which has a single input, so no obvious reason to deliver stereo data. However, I had no certainty about if and how that would work; I would have had to try my luck. Maybe my brother could help me out if he has the Solo M-Audio interface, maybe not; I didn’t remember which one he had.
  • My friend suggested I use my mixer as before. That would require me to unplug all the wires from my mixer, move it to my computer desk to record stuff, then move and plug my mixer back in on my music table to play some music. Quite annoying.
  • My friend suggested using the inserts on the M-Audio interface. This quickly appeared to be a hard task, as making use of them requires custom cables designed for inserts. In particular, I would need a Y splitter going from a balanced TRS jack into two separate mono jacks! Most jack Y splitters just duplicate a stereo signal. The only TRS Y splitters I could find were on eBay.
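For what it’s worth, the post-processing in the first option above doesn’t have to involve a GUI editor. Assuming a tool like ffmpeg is available, the stereo-to-mono fold-down is a one-liner (the file names below are placeholders):

```shell
# Average the two channels of a stereo WAV into a single mono channel
ffmpeg -i voiceover-stereo.wav -ac 1 voiceover-mono.wav

# Or keep only the left channel, where the microphone actually is
ffmpeg -i voiceover-stereo.wav -af "pan=mono|c0=c0" voiceover-mono.wav
```

It still remains a manual step per voice-over, which is exactly why I rejected that option.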

I was quite desperate and about to give up on recording or switch back to my H2N, which works but produces recordings with a lot of background noise. My last hope was Virtual Audio Cable. Tailoring it to my needs required a bit of trickery, but it ended up working, so I purchased a license for it.

From stereo to mono with Virtual Audio Cable

The first piece of this intricate puzzle can be found by right-clicking on the Windows mixer in the task bar and selecting recording devices.

Double-clicking on the M-Audio line device and accessing the last tab shows the following.

The default input setting is two channels, thus stereo. Interesting. What if I switched this to mono? Wouldn’t this be enough to tell both VideoStudio and Bandicam to record a mono track? If they simply use the default settings, that could work, no? Well, no, because the M-Audio driver doesn’t accept any setting other than 2 channels! I tried with both the Windows built-in driver and the M-Audio one: same result. I probably need a better audio interface. But this is enough for DAWs such as Ableton Live, which are able to pick and choose which channels to record.

I thus had to implement a patch using a virtual cable. For this, I accessed the second tab of the M-Audio line device, which allows listening to the captured audio. However, instead of feeding the captured audio to the default device as most people would do, I routed it to a virtual device provided by Virtual Audio Cable.

That Line 1 entry appears in both the playback and recording devices. It is a virtual cable that can be used to transfer audio from one process to another. Based on this reasoning, I found the Line 1 entry in my recording devices and made it the default recording device. In my case, it is called Mic 1 because I fiddled with the control panel of Virtual Audio Cable, but that’s not necessary.

Hoping for a miracle, I double-clicked the virtual recording device, accessed the last tab and clicked on the drop-down menu for channel selection. I was then able to select a 1-channel input!

I then tested, and that finally worked! Windows “plays” the captured audio into the virtual cable, which coerces it into mono, and that mono stream can be “recorded” by software programs. After a lot of frustrating research with less and less hope for a solution, I ended up with working voice recording again. I had to purchase the full version of Virtual Audio Cable for this to work without the annoying “Trial” message in my recorded sound, but at least I didn’t have to wait for a Y splitter cable ordered from eBay or try my luck with USB microphones or new audio interfaces without being sure they would solve my issue.

Ubuntu 16.04 almost killed my current HTPC setup

Yesterday, I tried to upgrade my HTPC running Ubuntu 14.04 to the new LTS, 16.04. That almost went smoothly, but some glitches happened at the end and some changes prevented my Minecraft FTB server from starting again. The problems are now solved, but for a while I wondered whether I would be able to get everything working again.

I had two hopes for this upgrade: getting an intermittent, awful audio glitch fixed and having the ProjectM visualization work again. From time to time, when I start the playback of a video file, I hear an awful, super loud distortion instead of the soundtrack. I then have to restart playback. Usually that’s enough; sometimes I have to restart it twice. Fortunately, audio doesn’t go crazy during playback. The ProjectM visualization started to fail, I think since Kodi 16. It just doesn’t kick in, leaving me with a blank screen. At least Kodi doesn’t crash or freeze as some versions of XBMC did when ProjectM was unable to access the Internet reliably.

CloneZilla failing to start

The week before the upgrade, I wanted to back up the SSD of my HTPC using CloneZilla in case problems happened. I used an old version I had burned on a CD because I thought this 2009 HTPC wouldn’t boot from USB sticks. Well, that old version, although working on my main PC, failed to start on my HTPC. It simply froze without any clue about what was happening. Before trying to download the new version and burn it on a CD, I noticed that my external USB hard drive was showing up in the boot options when pressing F8 at computer startup. I thus tried to boot a CloneZilla USB stick running a more recent version, and that worked. I don’t know if my HTPC was always able to boot off USB; maybe this capability got added by a BIOS upgrade. That was a good thing, and it allowed me to perform my backup.

Dist-upgrade or clean install?

Several people on forums recommend performing a clean install, claiming that too many things change from one version to the other. That may be true in some cases, and it’s probably the safest route, but unfortunately a clean install doesn’t always detect the drives to mount, requiring time-consuming modifications to /etc/fstab (with copy/pasting of drive UUIDs), and then I would have to figure out which packages were previously installed and reinstall them. I also have a couple of cron jobs performing automatic backups of my Minecraft worlds that I would need to recreate.
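For reference, the fstab rebuilding I was dreading looks roughly like this; the UUID and mount point below are placeholders, not my actual drives:

```shell
# List every partition with its UUID
sudo blkid

# Append an entry for a data drive to /etc/fstab
echo 'UUID=1234-abcd-5678  /mnt/media  ext4  defaults  0  2' | sudo tee -a /etc/fstab

# Mount everything listed in fstab to verify the entry before rebooting
sudo mount -a
```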

Instead of doing that, I tried to use the Update Manager to perform a dist-upgrade. Unfortunately, by default the tool won’t go from one LTS to the next: you have to go all the way through 14.10, 15.04 and 15.10 before reaching 16.04! Each dist-upgrade would have taken at least two hours, making this process really painful nonsense. Instead, I tried calling update-manager -d and got the option to go straight from 14.04 to 16.04!
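For those trying the same jump, here is roughly what it amounts to from the command line; the edit of the upgrader configuration is my reconstruction of the usual advice, not something I am certain I needed:

```shell
# Tell the release upgrader to only offer LTS releases
sudo sed -i 's/^Prompt=.*/Prompt=lts/' /etc/update-manager/release-upgrades

# Graphical upgrade, allowing the latest release
update-manager -d

# Command-line alternative
sudo do-release-upgrade -d
```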

During the installation, I thought that if the power supply of this relatively old system died during the process, the system would probably be unrecoverable, requiring a backup restore or a clean install. Ouch! Luckily, no such thing happened.

TeXLive broken

During the dist-upgrade, I got some error messages because the updated TeXLive-related packages couldn’t be configured properly. Why is TeXLive installed on this HTPC? I don’t remember exactly. I don’t need to compile any LaTeX document on this machine, so this didn’t seem like an issue at all to me. I just asked the installer to ignore the errors and made a note to remove the TeXLive packages after the upgrade, to be sure not to run into issues if, for some obscure reason, I wanted to compile a LaTeX document later on.

Failed dist-upgrade

Unfortunately, the dist-upgrade aborted with an error, with no accurate information, just a message telling me the dist-upgrade had failed. Argh! The system couldn’t shut down or reboot anymore, even when running sudo reboot from the command line. I was so frustrated that I considered shutting down this machine, which has caused me issue after issue for more than seven years, and never turning it back on again. If I weren’t able to recover from this failure, I could however have restored my CloneZilla image after taking a break from this catastrophic upgrade. In other words, all wasn’t lost.

I tried pressing the power button a couple of times; the screen went blank and remained blank for a few seconds, then the stupid machine rebooted. At least the broken Ubuntu installation started up to the GUI. Assuming the main issue was the TeXLive glitch, I opened a terminal and tried to remove the TeXLive package: sudo apt-get remove texlive. This failed. Apt-get was reporting errors about the TeXLive-related packages that weren’t configured. I tried to remove the package using dpkg, which complained that texlive wasn’t an installed package. I then tried searching for the packages using apt-cache pkgnames tex, and ended up removing tex-common. That got rid of the incorrectly configured packages and unblocked apt-get.

After this, I ran apt-get update, then apt-get dist-upgrade. That installed a couple of additional packages. Then I ran apt-get autoremove to remove the obsolete packages. This, hopefully, completed the dist-upgrade. I also rebooted to make sure the system could still boot after all that.
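Condensed, the recovery sequence from the two paragraphs above was roughly the following (package names as I remember them):

```shell
# Find which tex-related packages are actually installed
apt-cache pkgnames tex

# Remove the half-configured package that was blocking apt
sudo dpkg --remove tex-common

# Finish the interrupted dist-upgrade and clean up
sudo apt-get update
sudo apt-get dist-upgrade
sudo apt-get autoremove
```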

OpenJDK 8 causing issues

This HTPC runs a Minecraft world my friend and I share. We log onto that map less and less often because my friend rarely plays and I am currently focusing on Agrarian Skies 2 rather than the old FTB Monster pack the map runs on. But I am considering starting a map on the FTB Infinity Expert Skyblock pack after I’m done (or completely blocked) with Agrarian Skies 2, and I would like to run it on a server with an auto-backup strategy in place and the possibility for friends to join in if they want. I thus wanted to keep the ability to run Minecraft servers on my HTPC.

Now, when I started the FTB Monster server, I was greeted with a meaningless ConcurrentModificationException. I may be able to retrieve the stack trace, but that is a bit pointless; it refers repeatedly to nonsensical internal class names. OK, this is probably broken because of Java 8 and won’t get fixed unless I upgrade the mod pack, which will either force me to start from scratch on a new map or require hours and hours of work to convert the map, and the map would be quite damaged after the upgrade. In particular, the switch to the Applied Energistics 2 mod will destroy my logistics network so badly that it will require a complete redesign and rebuild. This will be even worse than the switch of Thermal Expansion and IC2 that occurred when I migrated (painfully) from Unleashed to Monster.

Simple solution: run this under OpenJDK 7. That’s simple under Windows; unfortunately… yep, there is no OpenJDK 7 package on apt-get for Ubuntu 16.04! Maybe I could have fiddled something with PPAs or installed Oracle’s JDK outside of the apt-get packaging system, but what’s the point of having a packaging system if it requires so many workarounds? I also thought about running the server in a Docker container built from an image providing Java 7, but that’s a bit convoluted and could cause other issues. Who knows if the server will behave well when running in a Docker container? It probably will, but that remains to be tested.
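If I ever try the Docker route, I imagine it would look something like this; the openjdk:7 image existed on Docker Hub at the time, but the server path and JAR name below are placeholders and this sketch is untested:

```shell
# Run the FTB server under Java 7 in a container (untested sketch;
# /path/to/ftb-monster and FTBServer.jar are placeholders)
docker run -d --name ftb-monster \
  -v /path/to/ftb-monster:/server -w /server \
  -p 25565:25565 \
  openjdk:7 java -Xmx4G -jar FTBServer.jar nogui
```

Port 25565 is the Minecraft default; the -v mount keeps the world data on the host so backups keep working.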

Fortunately, I figured out a way to patch the installation by adding a new JAR to the mods folder. The JAR comes from http://ftb.cursecdn.com/FTB2/maven/net/minecraftforge/lex/legacyjavafixer/1.0/legacyjavafixer-1.0.jar and was recommended by a forum post at http://support.feed-the-beast.com/t/cant-start-crashlanding-server-unable-to-launch-forgemodloader/6028. Installing the JAR fixed the issue and allowed me to start the server!
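Applying the fix amounts to dropping the JAR into the server’s mods folder (the server path below is a placeholder):

```shell
cd /path/to/ftb-monster/mods
wget http://ftb.cursecdn.com/FTB2/maven/net/minecraftforge/lex/legacyjavafixer/1.0/legacyjavafixer-1.0.jar
```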

Totally unexpected, very frustrating

In order to test my Minecraft server, I started the FTB Launcher on my Ubuntu 16.04 main computer. From the launcher, I started the FTB Monster pack: crash. OpenJDK 8, again. I had to apply the JAR patch on my client as well. I did it (instead of fiddling with a manual JDK 7 installation) and that worked. I was able to log onto my server and enter my world. However, as soon as I pressed F12 to go full screen, the screen went blank and everything froze. No way to get out of the game by switching desktops, no way to kill the game window with ALT-F4. I would once again have had to go to another machine, SSH into my main computer, kill the JVM, fail, and try again with kill -9. Instead, I just rebooted the machine, tried with Windows, and that worked. My Minecraft setup was correct; the client just now requires a different video card or driver to work reliably on Ubuntu. But I changed from the onboard Intel HD to an NVIDIA GeForce add-on card in 2013 precisely for that reason. Having to switch graphics cards back and forth from one Ubuntu version to another is total nonsense to me.

Kodi is gone

I don’t know exactly how it happened, but Kodi, the new name of XBMC, got removed during the upgrade. Simply reinstalling it was enough to fix this. Kodi still works fine for music and video playback. The ProjectM visualization is still broken, though, but that’s not a big deal. I haven’t heard the audio distortion since the upgrade, but it’s too recent to tell if it’s gone for good.


For now, I’m not sure the upgrade was worth it, but at least it didn’t break things. The main functionalities of my HTPC are still there: the Minecraft server runs, I was able to listen to YouTube videos, Kodi works for music and videos, and SSH is working properly. I’ll have to see if other surprises await me.

Taking control of your own machine

Not being administrator on your own Windows-based PC or laptop is a real shame. It prevents the installation of most software programs, and some settings are not accessible. This situation is most commonly caused by system administrators on a power trip, but it can also happen on a home computer configured for multiple users: one runs under user accounts and sometimes, less and less often, switches to an administrator account to install software. The inevitable then happens: the administrator password is forgotten.

The simplest solution in this case is to wipe the computer and reinstall Windows, but I needed to do better than this two years ago. This post describes what happened and what I did to get around the issue. Anyone trying this should be careful and be aware that it could cause trouble, especially if the gained privileges are misused afterwards. I only gained administrative privileges on a testing ultrabook. That couldn’t and didn’t grant me any permission on other systems.

A new but limited ultrabook

On Friday, April 26th 2013, I got a new Windows 8 ultrabook at my workplace. It was officially for testing a Windows-based virtual assistant we were developing at that time, but that machine could do more: temporarily replace my official work laptop, which was becoming too sluggish. Replacement of the old laptop had been delayed for procedural reasons. I knew I could install my stuff on the ultrabook without disturbing the virtual assistant application, so the ultrabook could perform both functions.

The Monday after, I was heading to the Burlington office of my company to provide technical support for people there. I wanted to bring the new ultrabook with me, so I needed to install a couple of programs on it before leaving. Unfortunately, I quickly noticed, Friday at the end of the day or during the weekend, I don’t remember, that I couldn’t install the JDK on the machine because I was not administrator. I wasn’t sure I would be able to get IT to grant me administrative privileges by Monday just before leaving, and I wanted to get some stuff installed before then.

Feeling a bit like a cowboy, I wanted to hack my way around this issue. Not being administrator on my corporate laptop is a concern for me. At my current workplace, this is not an issue, but I have heard it is a problem in other companies. Having a last-resort way out seemed useful to me. I eventually found one, and it leaves almost no traces if everything goes well. Keep in mind this impacts just the hacked computer, nothing else on the network.

Shutting down Windows 8 properly

The main idea of my strategy was to boot the ultrabook into Linux, mount the Windows partition and hack the registry to do something about the unknown administrator password. For this, Windows 8 has to be shut down properly. A new feature called hybrid startup causes the shutdown to be unclean, preventing Linux from mounting the Windows partition read-write. Fortunately, this can be worked around by cleanly shutting down the PC. The simplest way is to start a command prompt (Windows key + R, then cmd) and type shutdown /s /t 0. Two years ago, I also found out I could hold the Shift key while clicking on the Shutdown button, but I’m not sure this works anymore.

Booting Linux

Then I needed to boot into Linux. The simplest solution is to use Offline NT Password Recovery & Registry Editor, but it was not compatible with UEFI at that time and I wasn’t sure I would be able to perform a non-UEFI boot on this Dell XPS 13 ultrabook. Moreover, I cannot find the download anymore for the tool. It seems that we now have to email the author to get a hidden link. I find this quite bad practice, and when that happens, I have a tendency to look elsewhere.

I thus tried to boot Ubuntu, and I had to do it from a USB key because there is no CD/DVD drive in the XPS 13. I don’t remember exactly how I got the Live USB key. I probably used the live USB creator tool built into Ubuntu, but other pages such as this one give clues about how to create it from Windows.
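From an existing Linux box, the key can also be written with dd; a sketch, where the ISO file name and /dev/sdX are placeholders (dd will overwrite whatever device you point it at, so double-check with lsblk first):

```shell
# Write an Ubuntu ISO to a USB key (placeholders: adjust ISO and USB to your setup)
ISO=ubuntu-15.04-desktop-amd64.iso
USB=/dev/sdX                 # replace with the real USB device, e.g. /dev/sdb
sudo dd if="$ISO" of="$USB" bs=4M
sync                         # flush everything to the key before unplugging it
```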

I then had to modify the BIOS/UEFI settings of the ultrabook to alter the boot priority. If I remember correctly, I had to hit F2 while the XPS 13 boots, before Windows starts of course. I managed to get the ultrabook to boot the USB stick in UEFI mode, but that crashed after the boot. I thus had to enable legacy boot and then boot the USB key in MBR, non-UEFI mode.


After I successfully booted into the Ubuntu Live USB, I started a terminal and entered sudo apt-get install chntpw. This installed the Offline NT Password Recovery tool. I just tested this while writing this post on an Ubuntu 15.04 box and it still works!

After the tool was installed, I of course started it with sudo chntpw and followed the instructions. I was offered the opportunity to reset the administrator password, but I didn’t like this, because I would not be able to restore the ultrabook to its original state: my hack would leave a trace. I found a better option: activate the hidden Administrator account! After this was done, I rebooted into Windows and was able to log in as Administrator.
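From memory, the whole sequence looked like the sketch below; the device name and paths are assumptions that depend on your disk layout (check with lsblk or GParted):

```shell
# Mount the Windows partition and point chntpw at the SAM hive
# (the device name /dev/sda2 is an assumption; check your layout first)
WINPART=/dev/sda2
sudo mkdir -p /mnt/windows
sudo mount "$WINPART" /mnt/windows

# The SAM hive holds the local accounts
HIVE=/mnt/windows/Windows/System32/config/SAM
sudo chntpw -l "$HIVE"                # list the local accounts in the hive
sudo chntpw -u Administrator "$HIVE"  # then choose "Unlock and enable user account"
```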

I don’t remember if I absolutely had to restore the UEFI settings to disable legacy boot in order for Windows 8 to boot again, but I did it for my intervention to be as clean and traceless as possible. At worst, I would have obtained an error message when attempting to boot without the USB key and would have had to alter the boot priority and/or disable legacy boot: no harm done to Windows.

One step further

The problem was solved, but I wanted to go one step further: transfer the gained administrative privileges to my regular user account! For this, while logged in as the local Administrator, I had to access Control Panel, then Administrative Tools, then Local Users and Groups. Unfortunately and very shockingly, this option has been completely hidden away in Windows 10: you once again have to search on Google to figure out you need to press the Windows + R keys to open the Run dialog, type lusrmgr.msc, and click/tap on OK. I hope one day Microsoft will understand this is a very bad and frustrating practice that will make many power users, including me if I could, migrate to Mac OS X.

I then selected Groups, double-clicked on Administrators and clicked Add to add a member. The system offered me a dialog box to type the user name to add, but Windows was unable to find my user name of the form <company name>\<user name>.

I don’t know how I thought about it, but I figured out that Windows would need to access my company’s Active Directory service to resolve user names to IDs. Since I was at home, I needed to establish a VPN connection. I thus installed the Cisco VPN client on the ultrabook (I would need it anyway afterwards), then was able to add my user account to the local Administrators group. I don’t know exactly how I got the VPN client: maybe I had a copy lying around on my main computer for obscure reasons, maybe I turned on my main corporate laptop to download it, I don’t remember. I was also able to hook up to the VPN from Ubuntu, without a tool downloadable only from my company’s intranet. But I got the VPN and that worked.

After I did that, I logged back in as my regular user and was able to install the JDK without any issue. Then I went back into Local Users and Groups, selected Users, double-clicked on Administrator and disabled the account. That closed the back door I used to gain administrative privileges, without taking away my new rights.
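For the record, both operations can also be done from an elevated Windows command prompt instead of the Local Users and Groups console; a sketch, where <company name>\<user name> stands for the domain account as above:

```batch
rem Add the domain account to the local Administrators group
rem (needs the VPN up so the domain name can be resolved)
net localgroup Administrators "<company name>\<user name>" /add

rem Later, close the back door: disable the built-in Administrator account
net user Administrator /active:no
```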

Will this always work?

No. Unfortunately, I can imagine ways to prevent this trick from working. The easiest is to set up a password preventing access to the BIOS settings. Not being able to modify BIOS settings means it is impossible to alter the boot priority. With that enforced, the only workaround would be to remove the SSD from the machine, install it in another computer running Ubuntu and run chntpw, making sure it works on the SSD, not on a potential main Windows install in dual boot on the Ubuntu box! Removing an SSD from a laptop or ultrabook is sometimes a risky operation that requires disassembling the keyboard, memory modules, casing, etc. Not sure I would have attempted it.

Of course, the latter workaround miserably fails if the disk is encrypted, e.g., with Symantec’s PGP Whole Disk Encryption. One possible workaround may be to get the SSD out again, install it on an Ubuntu box itself running Symantec’s PGP and, if the encrypted drive’s password is known, maybe it is enough to decrypt the drive and mount it, allowing chntpw to work on it. It could also happen that the encryption key is made of the user’s password and a hash derived from the computer’s information. In that case, it could be quite hard to work around the protection. One possibility, if the BIOS is not password-protected, may be to boot into a Live USB Ubuntu, install the encryption tool and try to decrypt the drive on the local computer itself.

Windows 10: a new hope or not?

Since I moved to Windows 8 two years ago, I experienced several issues with my system. There was nothing major, and I only suspected that the cause was Windows 8 itself, so I was worried about finding the same issues after downgrading to Windows 7. I thus kept that Windows 8 installation and lived with the hurdles.

In particular, Windows 8 broke NTFS support in Ubuntu, periodically preventing my hard drive from showing up. I had to disable the new hybrid startup to get rid of this problem. However, a few months later, the issue showed up again, until I completely disabled hibernation using an obscure, impossible-to-remember command. That had the strange side effect of shutting the computer down after it was in standby for too long, so I had to disable automatic standby as well.
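If memory serves, that obscure command was powercfg, run from an elevated Windows command prompt; disabling hibernation also disables hybrid startup:

```batch
rem Disable hibernation entirely (and, with it, hybrid startup)
powercfg /h off
```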

One day, all of a sudden, the system completely stopped working and I had to refresh the PC, which completely destroyed all my configuration. Instead of reinstalling all drivers and applications, I just restored a CloneZilla image.

Sometimes, login becomes slow. The computer starts at normal speed, I reach the login screen, then I have to wait 30 seconds between the time I type my password and the time I reach the desktop. Usually, I’m not experiencing this ridiculous delay, but it happens often enough to bother me. I have a Core i7 with an SSD, so I find it quite bad that Windows compensates for the hardware’s efficiency with software delays!

There is also that intricate audio issue making computer-assisted music a pain: the computer refuses to shut down after a session in Ableton’s Live, Live suddenly refuses to start and requires a reinstallation of the Visual C++ libraries, sound starts to be choppy when using ASIO for playback or recording, S/PDIF distortion occurs with my M-Audio Fast Track Pro when hooked to my UltraNova synthesizer, etc.

The only “solution” I was offered was to downgrade to Windows 7, because Microsoft releases one good version of Windows out of two. But I was worried that downgrading would cause me activation issues, and I didn’t want my old Windows 7 problem of low contrast between selected and unselected menu items to come back. I have this issue at work and the only fully working fix is to completely disable the Aero theme, falling back to the classic theme.

The upgrade

Rather than letting Microsoft decide for me when I would get the upgrade to Windows 10, I downloaded the Windows 10 setup tool and ran it to download the new system and transfer it onto a USB key. I put this USB key aside for the day I would be ready to attempt the upgrade.

I tried the upgrade on Saturday, August 22, 2015, a few weeks after the official release. Before my attempt, I checked that all my main applications and device drivers would be available. I also backed up all my data and created a new CloneZilla image of my SSD containing Windows 8.1 and Ubuntu 15.04 in dual boot.

My first idea was to completely wipe Windows 8.1’s partition and install Windows 10 fresh, eliminating all quirks and issues that could arise from this old and possibly altered Windows 8.1 setup. Unfortunately, things didn’t go as smoothly as I expected. I was able to boot from my Windows 10 USB stick and reached an installer, but I was blocked at the step requesting a product key. None of my Windows 8 and Windows 7 keys worked. There was a button to skip the step, and I thought about trying that and attempting the activation later: maybe an update would allow my old product key to work, or maybe the validation of the product key required an Internet connection, which was not available because my network interface wasn’t supported or initialized at that point.

Instead of running the risk of not being able to activate my freshly installed Windows 10, I turned on my ultrabook and searched the Internet. I first got a forum post suggesting to call Microsoft; maybe they would be able to perform the activation by phone even though it doesn’t work over the Internet. No way! I didn’t want to spend frustrating minutes trying to enter a validation key that the operator would dictate to me, one hand on the keyboard, one hand holding the phone, just because Microsoft cannot evolve. Fortunately, I searched a bit more and found out that the upgrade process allowed wiping out pretty much everything: installed applications and user configuration.

I thus decided to try this instead of fiddling with the activation issues I had been trying to avoid for two years by refraining from downgrading to Windows 7! I restarted into Windows 8.1 and executed the setup program on my Windows 10 USB key.

I had the choice between preserving all my applications and data, only the data, or nothing at all. I first thought about the third option, to start as fresh as possible, but I was worried that Windows could destroy my whole partition layout, including my data drive. I didn’t want to reinstall Ubuntu and uselessly restore all my data from backups, so I chose the second option: preserve data but remove applications.

After the setup program restarted my computer, I got stuck with a boot error message. I first thought the Windows installation had messed up and I would have to attempt the clean install and then work around the activation hurdles, but I quickly found out that the error was related to GRUB. A bit annoyed that, once again, Windows broke GRUB, which is needed to boot Ubuntu, I restarted my machine and changed the boot option to start Windows instead of Ubuntu. I was then able to resume the Windows installation, which went well after this small hiccup.

Cannot log in!

After the upgrade completed, I got the new welcome screen, very similar to Windows 8.1’s. I entered my usual user name and password and got an error message: invalid password. I tried many times, same issue. I first thought about that stupid, annoying, insane caps lock: no, caps lock was off. I then thought there was a networking issue. Since I am using a Microsoft account to log in, my password is stored both on my local machine and on Microsoft’s servers. The format of the password cache may have changed between Windows 8 and 10, so a first login in Windows 10 could require network access. Maybe, I thought, the network interface is not detected or requires a driver that I would have to install in safe mode. Quite bad: Windows installation is definitely getting harder and harder, and we will soon have to forget about any upgrade, unless we get a new computer with a preinstalled OS.

Fortunately, the problem was simpler, far simpler, almost shockingly simpler: Windows 10 had reset my keyboard layout to France French AZERTY! I found an icon that allowed me to set the keyboard at login time back to Canadian French, and then my password worked!

Good news

After these initial issues (no clean install possible, killed GRUB and login problems), I was able to reach the desktop and things went quite smoothly. The Windows 10 desktop is quite similar to Windows 7’s.

Screenshot 2015-08-29 13.55.45

The start menu, which was removed from Windows 8, is back again and works pretty well.

Screenshot 2015-08-29 13.56.02

The contrast issue between selected and unselected menu items didn’t come back. The new start menu is even a bit easier to use than the Windows 7 one.

I didn’t care about the personal assistant Cortana and the new Web browser Edge, but I really liked the fact that Alt-Tab finally works correctly. Since Windows 7, when pressing Alt-Tab and holding Alt, pressing Tab to toggle between opened windows, I always had to be careful not to select the desktop, which was listed among the proposed targets. I made this mistake again and again, especially when struggling with problems, and that made things annoying. The only workarounds were to stop using Alt-Tab and fiddle with the mouse instead, or to alleviate the issue with solutions such as VirtuaWin. Windows 10 helped with that by removing this fake desktop window from the targets proposed by the Alt-Tab switcher.

Screenshot 2015-08-29 13.56.21

Even with that small improvement, there is still a need to group windows into virtual workspaces for efficient navigation. Windows 10 finally addressed this through built-in virtual desktops. This feature is activated by pressing Windows-Tab; then it is possible to pick another desktop or create a new one.

Screenshot 2015-08-29 13.57.08

I was worried that Microsoft would, like Apple, implement this in a poor way, making it totally useless. On some implementations of virtual desktops, namely Apple’s, but also some versions of GNOME 3, Alt-Tab shows opened windows from all desktops, making the grouping totally useless for me. Virtual desktops are then useful only for people able to have multiple windows opened side by side on the screen. In my case, I almost always have a maximized window because, with the larger fonts I need to use, I cannot fit as much information in windows as most other users can. I was happy that Windows 10 correctly honored the grouping of windows when pressing Alt-Tab.

However, I’m still unsure this will be efficient for me to use. For now, I didn’t find any effective way to go from one desktop to another. I had to press Windows-Tab, then Tab, then the arrow keys, then Enter. I will probably always mess up the sequence, e.g., press the arrow keys before Tab. However, the awkward user interface may be compensated by better reliability, since the feature is built in rather than hacked using window hiding like VirtuaWin does. I hope I will get fewer random issues like the keyboard not working after switching to a new desktop, VirtuaWin offering to close itself when pressing Alt-F4 instead of closing the current window, etc.

Another improvement is the possibility of disabling DPI scaling for 64-bit applications without fiddling in the registry. Up to Windows 8.1, this was possible only for 32-bit applications, so for Ableton’s Live, which has issues with DPI scaling, I had to use a registry tweak. This is annoying, hard to remember and prone to disasters. What if I removed a registry key by mistake?
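For what it’s worth, the tweak in question was, as far as I remember, a per-application compatibility flag under AppCompatFlags\Layers; a sketch, where the path to Live’s executable is only an example to adjust to your install:

```batch
rem Disable DPI scaling for one application (example path, adjust to yours)
reg add "HKCU\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers" ^
  /v "C:\ProgramData\Ableton\Live 9 Suite\Program\Ableton Live 9 Suite.exe" ^
  /t REG_SZ /d "~ HIGHDPIAWARE"
```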

I also liked that Explorer now groups the favorites and libraries in the same list rather than having two separate lists. Since Windows 8.1, I had to spend almost 20 seconds each time I wanted to reach my Dropbox folder. When I started Explorer with Windows-E, the Dropbox shortcut wasn’t shown, so I had to scroll up. The mouse wheel didn’t work, so I had to locate the too-small scroll bars and use them, or try with my touch screen. Sometimes it worked, sometimes it moved stuff around!

Screenshot 2015-08-29 13.56.12

Software compatibility

I read quite a few concerning forum posts about broken programs in Windows 10. In particular, several people had issues with Ableton’s Live 9.1, the program I use for computer-assisted music. I’m underusing it quite a bit for now, but I would like to continue exploring it. My hope is to make better use of it at some point in my personal progression in musical creation. Some people were saying Ableton’s Live 9.2 beta version worked better. Fortunately, that beta became final before I upgraded to Windows 10 and Ableton didn’t charge upgrade fees, so I got the new version without any hesitation. For now, it works correctly, but I didn’t try to push it hard yet: no ASIO, no multitrack recording, etc. That will come, and hopefully there will be fewer issues than with Windows 8.1.

I didn’t install the driver for my M-Audio interface yet. I’m still using the interface built into my UltraNova, for which Windows 10 compatibility is official, as opposed to M-Audio’s Fast Track Pro. My concern is that the installation may mess things up and cause issues that would be entangled with other problems. I will thus make sure everything is stable before dropping this driver in, and probably even be as paranoid as creating a new CloneZilla image before installing that piece of software. If the M-Audio interface is as flawed with Windows 10 as it was with Windows 8.1, I will have to consider purchasing a new one: minimum four inputs, maybe eight if that’s not overly expensive, I’ll see. If Live is also unstable, I may have to try my luck on a brand new Mac, and probably end up setting a lower resolution than my LCD’s native one, because fonts are too small on Mac OS X and cannot be enlarged in a consistent way.

Ninite installed most of my main applications: LibreOffice, Firefox, Thunderbird, etc. GNU Emacs still works, same for Minecraft, both FTB Monster and Agrarian Skies 2 packs. I also installed the latest version of Bandicam, which seems to work, but I didn’t perform any gameplay recording since my upgrade.

I don’t know about Corel’s VideoStudio yet. I use this sometimes flaky tool to perform basic editing on my Minecraft gameplay videos. I am planning to upgrade it to the latest version, which hopefully will address potential Windows 10 issues. I didn’t read any positive or negative reports about this software on the new Windows.

I’m a bit concerned about VirtualBox, whose version 5 has issues with Windows 10. They say nothing about version 4.3, which I chose conservatively because I was putting up a virtual machine at my workplace shared with colleagues. Fortunately, I don’t absolutely need VirtualBox for my personal use now. It may just be useful as a backup solution if I work from home one day and my work laptop fails, but I still have to check that Cisco’s AnyConnect VPN works correctly with Windows 10. Anyway, I still have the option to boot into Ubuntu, where Cisco’s VPN and VirtualBox work!

Ubuntu threatened once again

The day after my Windows 10 upgrade, I tried to repair my Ubuntu boot. First, I booted back into GRUB, hoping it would work. Windows 10 should have only changed its own stuff in the ESP, leaving GRUB intact. No, no luck. I was sent to a rescue prompt. I tried to enter commands without success. I tried “help”: still no luck. I would thus have had to check on the Internet to figure out what rudimentary commands this tool accepts. Why didn’t it offer online help?

Tired of fiddling with Ubuntu, I rebooted into Windows 10, now ready to delete its partition, enlarge the Windows 10 partition and install a fresh new Ubuntu in a VirtualBox virtual machine. I was a bit sad to downgrade Ubuntu from a first-class operating system to a slave of Microsoft, but I felt it was better for my mental health to do it sooner rather than later.

However, I got blocked by multiple unknown partitions preventing me from just enlarging the Windows 10 space. I would have to remove my Linux partitions and then move these unknown partitions, unless I knew for sure I could delete them as well. To figure this out, I had to reboot into an Ubuntu Live DVD. I then found out that I had never downloaded any Ubuntu 15.04 ISO! I ended up trying with 14.10 and got confirmation that the unknown partitions contained Windows recovery data: better to preserve them.

While booted into the Ubuntu Live DVD, I decided to restore GRUB. This went well, but I had to use the contorted method consisting of making a chroot environment with my Ubuntu installation and reinstalling GRUB from there.
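That contorted method boils down to something like the sketch below; the device names are assumptions matching a typical dual-boot layout, so adapt them to yours:

```shell
# Reinstall GRUB from a live session by chrooting into the installed Ubuntu
# (device names are assumptions; check yours with lsblk or GParted)
ROOTPART=/dev/sda5   # the Ubuntu root partition
DISK=/dev/sda        # the disk GRUB is installed to
sudo mount "$ROOTPART" /mnt
# on a UEFI machine, also mount the ESP, e.g.: sudo mount /dev/sda1 /mnt/boot/efi
for d in dev proc sys; do sudo mount --bind "/$d" "/mnt/$d"; done
sudo chroot /mnt /bin/bash -c "grub-install $DISK && update-grub"
```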

Surprise: the nasty pig once again wasn’t finding my Windows partition! I had to search through my blog posts to figure out how I addressed this in the past. Last time, my ESP wasn’t set up with the Boot flag. But this time, my ESP was correct. Oh no, don’t tell me Windows 10 messed things up so badly that this time I would have to manually add it to the GRUB menu, and redo it each time something upgrades the kernel! Before resorting to doing and redoing that, I rebooted into Ubuntu, which worked, tried to rerun update-grub, and this time, the Windows loader got detected! Phew!

Why do standby and hibernate work only on laptops?

The day after the upgrade, I left my machine unattended for some time. When I came back, it was in standby mode. I turned the computer back on, things seemed to work right, then poof, blue screen. According to the error message, there was a corrupted driver. The system had to reboot once again. This is not the first time I have had to reboot the whole system to get out of standby or hibernate, and that happens only on desktop computers. On laptops, standby and hibernate work correctly. I got more issues with this on Linux than on Windows, but it also happens on Windows, without any clear solution other than trying random things and reinstalling pretty much all drivers, without any guarantee of success. Maybe standby and hibernate should just be disabled by default on desktops; it is just too annoying to have to reboot to get out of this state! After that issue, I disabled automatic standby, so it won’t happen again until I decide to give it a new shot later on. I didn’t get other blue screens after this.

Seems NTFS-3G requires a patch that isn’t coming

As I wrote above, since Windows 8, I have been having issues with mounting my NTFS partitions under Ubuntu. The partitions just don’t mount, until I reboot into Windows, then reboot back into Ubuntu. NTFS-3G, the driver used by Linux to read/write NTFS, would definitely need to be patched to deal with incorrectly unmounted partitions. This is far from great, but it is needed because Microsoft is doing messy stuff with NTFS. Until we get this patch, all we can do is disable hybrid startup.
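Until such a patch exists, a partition left dirty by hybrid startup can also be forced by hand; a sketch, where the device name is an assumption, and note that remove_hiberfile throws away the saved Windows session, so only use it if you don’t need to resume Windows:

```shell
# Force-mount an NTFS partition left dirty by Windows hybrid startup
# (/dev/sda2 is an assumption; remove_hiberfile discards the hibernated session)
NTFSPART=/dev/sda2
sudo mkdir -p /mnt/windows
sudo mount -t ntfs-3g -o remove_hiberfile "$NTFSPART" /mnt/windows
```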

This time, the issue was more severe. Rather than just not mounting NTFS drives, Ubuntu refused to boot at all. I don’t know exactly why, and I wasn’t able to perform any diagnostic, because the rescue prompt that came up was shown in fonts too small to read. I ended up rebooting, which just froze things up: no way to start up Ubuntu, even in the single-user rescue prompt!

When this happened, the Tuesday after my upgrade, I rebooted into Windows to make sure at least my SSD wasn’t dead, then rebooted back into Ubuntu… with success!

Later on, I disabled hybrid startup once again, and that seemed to have fixed things. I haven’t had other Ubuntu issues since then.

How about my ultrabook?

The situation on my Lenovo IdeaPad ultrabook was a bit more complicated. First, the machine had Windows 8.1 Home edition, so unless I upgraded it to the Professional edition first, I had to download another installation medium instead of reusing the USB key I created for my main PC. Moreover, I got no prompt offering to upgrade to Windows 10 like the ones that showed up on my main PC, so it was possible the machine ran a special Lenovo version of Windows 8. If I upgraded to Windows 10 in such a case, it might either just fail or leave me with flaky behaviors.

In particular, when I flip my ultrabook into tablet mode, the mechanism disabling the keyboard is software-based. Some people who tried Ubuntu on this model reported that the keyboard was still working when the machine was flipped into tablet mode. Without Lenovo’s customizations and drivers, I might get this incorrect behavior in Windows 10. Even worse, the touch screen might just not work. I thus had to be careful when upgrading this machine, make all possible backups beforehand, and be mentally ready to fight against the machine and then downgrade back to Windows 8.1.

On September 19, 2015, things changed slightly: I got the notification that the upgrade was available for my ultrabook. I proceeded with the upgrade on Saturday, October 3, after I backed up the machine using CloneZilla. The upgrade happened without issue, except that the machine seemed slower afterwards. However, things settled after a few days and the system is now responding correctly.

Cannot transfer files anymore from my Galaxy Nexus through USB

Last Friday, I tried to hook up my Galaxy Nexus phone through USB to transfer some files to my computer. After a few seconds, a completely empty Nautilus window appeared. Once again, Ubuntu was incapable of detecting my device. This happened a few versions ago and I had to use obscure, impossible-to-remember MTP commands to perform transfers. I didn’t want to search for these commands again, tired of losing more and more time with artificial problems. If Ubuntu degrades to this point from one version to the next, I will be better off switching to Windows and installing Linux only in virtual machines. This is not the only issue I got, and most bugs (mouse pointer, Emacs, M-Audio sound stability) persist across version upgrades. Canonical now seems to focus on Mir and newer Unity versions, which I really dislike because Mir will break everything for five or six versions. Either that, or Canonical will cut corners on keyboard accessibility, resulting in a UI that will be almost unusable for me. I expect this will be my hardest Linux time ever when that beast comes out, until it stabilizes.
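For the record, the obscure MTP route looks roughly like the sketch below; the package names are from the Ubuntu archives, and of course the phone has to be unlocked first:

```shell
# Manual MTP access when the file manager shows an empty window
sudo apt-get install -y jmtpfs mtp-tools
mtp-detect                  # should list the phone if MTP is reachable
MNT="$HOME/phone"
mkdir -p "$MNT"
jmtpfs "$MNT"               # mount the device through FUSE
# ... copy files with cp or the file manager, then unmount:
fusermount -u "$MNT"
```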

However, my actual bug was worse than this: the phone didn’t connect through Windows either! When I plugged my phone into a Windows machine, an empty Explorer window similar to the one below came up and nothing else. Does that mean my phone is dying, progressively losing functionality? Probably.

Screenshot 2014-11-30 09.04.15

I didn’t attempt any Google search about this. It would be worthless. I would find forum posts about people replacing the cable, doing factory resets, sending their phone for repair or replacement, etc. The phone hooks up and something is detected by the computer, so why would a new USB cable help? Yes, I can do factory reset after factory reset, and that will probably fix it, but what’s the point if I know I will have to redo it a few months later, for no reason, unless I install NOTHING on the phone? And all I would get through technical support are no-go solutions or offers for a new phone that would force me to switch to a more expensive plan with my provider, or to buy something from nowhere with an old version of Android.

Before accepting this conclusion and starting to look at whether I could get a Google Nexus phone from Google rather than going through Fido and affecting my phone/data plan, or get something somewhat correct from DX.com, I tried with a different cable: same effect. Then I saw the home screen on the device and remembered I had recently set up a PIN to protect my Google account from any tampering by somebody who might get my phone if I lose it, or steal it from me.


I entered my PIN and saw with surprise and relief the following window on my computer:

Screenshot 2014-11-30 09.05.34

Ta-da! This time, the solution was simple! This was just normal data protection! So my USB connection is still working!

Note that locking the phone doesn’t shut off USB access until the cable is disconnected. The PIN also doesn’t prevent me from answering a call, so this is not as problematic as I feared it would be.

One SSD for my HTPC

A bit more than a month ago, I successfully transferred my dual-boot Windows 8.1 and Ubuntu 14.04 setup from two 120GB solid state drives (SSD) to a single 240GB drive. I ran into several problems restoring the bootloaders of the two operating systems, thought many times I would have to reinstall, then figured out a way to make them boot.

But what happened to the two drives I removed from my main computer? Well, they sat on top of a shelf. But at least one drive will be repurposed to become part of A.R.D.-NAS, my HTPC. Sunday, October 26, 2014, I finally got the time and courage to undertake the transfer operation. This time, the software part was pretty smooth, but the hardware part was a uselessly intricate puzzle. During the process, I wondered many times about the purpose of generic hardware if it doesn’t fit well together, and grumbled about the lack of any viable alternative.

The sacrifice

Well, my NMedia HTPC case has six 3.5″ drive bays. This is quite nice for an HTPC case. This is possible because I chose an ATX case, to get a motherboard with rear S/PDIF audio connectors rather than just headers accepting brackets I could find nowhere. This case is a bit bulky; I would build from a MicroATX case if I had to start from scratch.

So installing this SSD seemed obvious at the start: just add the drive, transfer the Linux boot partition from the hard drive to the SSD, remove the original boot partition, set up GRUB on the SSD and ta-da. No, things are rarely that simple. I thought my motherboard had only 4 SATA ports, and they were all used: one 1TB hard drive, a second 1.5TB hard drive, a third 3TB hard drive, then a Blu-ray reader/writer. Why so many hard drives? Well, I am ripping and storing all my video disks, even the huge Blu-rays, to avoid the need to search for them on shelves.

Even if I had remembered correctly that I had six ports on the motherboard (two are free!), my PSU only had four SATA power connectors, so I would not be able to easily and reliably connect all my drives. I could try to find some splitter cables or Molex-to-SATA adapters, but that would add a point of failure. I could replace my PSU with one with more SATA power cables, but it would also have more Molex cables, PCI Express connectors, etc. Unless I went with a more expensive modular PSU, all these cables would have cluttered my case.

The safest and cheapest solution was to sacrifice one of the hard drives, the 1TB one of course, the smallest. I thus had to move files around to have less than 120GB of stuff on the hard drive that would be moved to the SSD. That process took a lot longer than I thought. My poor HTPC spent the whole Saturday afternoon copying files around! Fortunately, this is machine time, so I had plenty of time to experiment with music creation on my still-new UltraNova synthesizer combined with Ableton’s Live multi-track ability.


On Sunday, I first burned the Ubuntu 14.04 ISO to a DVD. Yes, Ubuntu is now too large for a CD; only a DVD will fit it. After that, I shut down my Minecraft server running on my HTPC and moved its files to another old PC. I started the server on the old PC and reconfigured the port mapping on my router. This way, if my friend wanted to kill a few zombies and creepers while I was installing my SSD, he would be able to do so, and I would not be stressed if something bad put my HTPC out of service (like something stuck in the CPU fan breaking it).

I then removed the cover of my HTPC and spent quite a bit of time trying to figure out which drive was the 1TB hard drive. Based on the position of the SATA connector on the motherboard, I presumed it was the left-most drive. I thus had to disconnect the drive in the middle bay, and use the freed-up power and data connectors to hook up my SSD. I then booted up the machine.

The following picture shows the drive temporarily hooked up.


Then I remembered my old 22″ LCD that I stopped using after purchasing my Dell touch screen. I went to pick it up in my computer room, put it on my kitchen table and plugged it in. This way, I could have the screen right in front of me with the keyboard on the table, rather than sitting in front of my 46″ LCD HDTV with the keyboard on my knees.

With the SSD and the LCD hooked up, I booted up my HTPC and quickly stuck the Ubuntu DVD in the blu-ray drive. After an incredible amount of time, the machine finally booted into the live Ubuntu DVD!

Data transfer

After Ubuntu started, I launched GParted and realized that I had chosen the wrong hard drive. The 1Tb drive containing my Ubuntu setup was the disconnected one. Oh no! Did that mean I would have to turn off the machine, connect the right drive and wait once again for this stupidly long, almost five-minute, DVD-based boot? No, not this time! Feeling like a cowboy, I decided to try something: drive hot swapping. This is possible with SATA, so let's see! I disconnected the 1.5Tb hard drive, starting with the SATA data cable, then the power cord, then hooked up the 1Tb drive. Hearing the old hard drive coming back to life was kind of thrilling. Everything went well: no cable stuck in my CPU or rear fans, and the PC didn't freeze like it would have with IDE. The hot swap worked.

After that, things were relatively straightforward. As with my main PC, I used GParted to transfer my Linux boot partition and reconstruct the layout. I fortunately remembered, beforehand, to reset the partition table. If I hadn't, the GPT that was on my SSD would have caused booting issues that would have driven me mad! I would probably have ended up reinstalling everything, angry at Ubuntu, technology and probably the whole of humankind. A single step, recreating the msdos partition table from GParted before the transfer, saved me all that!

The following picture shows my LCD displaying the progress of the transfer.


See how bulky this setup was: HTPC on the floor, case opened, SSD hanging on top. Fortunately, it was possible to make this setup clean once again after all this.


The home partition: too big to fit on the SSD

Unfortunately, GParted refused to transfer my home partition to the SSD, because it was obviously too large. I could have shrunk it in order to copy it, but I wanted to avoid altering the hard drive in case something bad happened. I thus instructed GParted to simply create a blank Ext4 partition and used cp to perform the copy. The following terminal session shows how I managed to do it in such a way that all file metadata (timestamps, ownership, permissions) was preserved.

ubuntu@ubuntu:~$ mkdir /media/old-home
mkdir: cannot create directory ‘/media/old-home’: Permission denied
ubuntu@ubuntu:~$ sudo mkdir /media/old-home
ubuntu@ubuntu:~$ sudo fdisk -l /dev/sda

Disk /dev/sda: 1000.2 GB, 1000204886016 bytes
255 heads, 63 sectors/track, 121601 cylinders, total 1953525168 sectors
Units = sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x000e2c4d

   Device Boot      Start         End      Blocks   Id  System
/dev/sda1   *          63    40965749    20482843+  83  Linux
/dev/sda2        40965750    81931499    20482875   83  Linux
/dev/sda3        81931500  1953520064   935794282+   5  Extended
/dev/sda5        81931563  1943286659   930677548+  83  Linux
/dev/sda6      1943286723  1953520064     5116671   82  Linux swap / Solaris
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sda5 /media/old-home/
ubuntu@ubuntu:~$ ls /media/old-home/
eric  lost+found  mythtv
ubuntu@ubuntu:~$ sudo fdisk -l /dev/sdg

Disk /dev/sdg: 120.0 GB, 120034123776 bytes
255 heads, 63 sectors/track, 14593 cylinders, total 234441648 sectors
Units = sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x000d9a0c

   Device Boot      Start         End      Blocks   Id  System
/dev/sdg1            2048    40968191    20483072   83  Linux
/dev/sdg2        40968192    81934335    20483072   83  Linux
/dev/sdg3        81934336   234440703    76253184    5  Extended
/dev/sdg5        81936384    92170239     5116928   82  Linux swap / Solaris
/dev/sdg6        92172288   234440703    71134208   83  Linux
ubuntu@ubuntu:~$ sudo mkdir /media/new-home
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sdg6 /media/new-home/
ubuntu@ubuntu:~$ sudo cp -a /media/old-home/* /media/new-home
ubuntu@ubuntu:~$ ls -a /media/new-home/ -l
total 36
drwxr-xr-x  5 root root  4096 Oct 26 19:49 .
drwxr-xr-x  1 root root   100 Oct 26 19:44 ..
drwxr-xr-x 69 1000 1000 12288 Oct 25 22:52 eric
drwx------  2 root root 16384 Sep 26  2009 lost+found
drwxr-xr-x  3  122  130  4096 Jan 24  2011 mythtv

The main idea is to mount both the old and new partitions, then use cp with the -a option and root access (with sudo) in order to preserve everything. The operation went smoothly.
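The -a behaviour is easy to check on a throwaway directory. A minimal sketch (toy paths, not the real /media mounts):

```shell
# Demonstrate that cp -a preserves permissions (it also keeps
# timestamps and ownership when run as root).
src=$(mktemp -d) && dst=$(mktemp -d)
echo "data" > "$src/file"
chmod 640 "$src/file"        # give the file a distinctive mode
cp -a "$src"/. "$dst"/       # "/." also picks up dotfiles, unlike "*"
stat -c '%a' "$dst/file"     # prints 640: the mode survived the copy
```

One caution: a * glob skips hidden entries at the top level of the source; the /. form shown here (or rsync -a) copies them too.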

The boot loader

Even after copying all Ubuntu-related data from my old hard drive, my SSD was still not bootable. To make booting off the SSD possible, I had to install GRUB. Unfortunately, reinstalling GRUB on Ubuntu is not as simple as it should be. If there is a package doing it, why isn't it built into Ubuntu's image? Maybe because for most setups, reinstalling from scratch takes 15 minutes. That's true, but then what about the tweaks to fix the too-small mouse pointer, make XBMC work with S/PDIF sound, reinstall MakeMKV, etc.? Each step is simple, at least when no unexpected difficulty creeps in, but the sum of all these small tweaks makes it long.

So let’s avoid this by running the following!

ubuntu@ubuntu:~$ sudo mkdir /media/ubuntu
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sdg1 /media/ubuntu/
ubuntu@ubuntu:~$ sudo mount --rbind /dev /media/ubuntu/dev
ubuntu@ubuntu:~$ sudo mount --rbind /sys /media/ubuntu/sys
ubuntu@ubuntu:~$ sudo mount --rbind /proc /media/ubuntu/proc
ubuntu@ubuntu:~$ sudo chroot /media/ubuntu
root@ubuntu:/# grub-install /dev/sdg
Installing for i386-pc platform.
Installation finished. No error reported.
root@ubuntu:/# update-grub
Generating grub configuration file ...
Warning: Setting GRUB_TIMEOUT to a non-zero value when GRUB_HIDDEN_TIMEOUT is set is no longer supported.
Found linux image: /boot/vmlinuz-3.13.0-37-generic
Found initrd image: /boot/initrd.img-3.13.0-37-generic
Found linux image: /boot/vmlinuz-3.13.0-36-generic
Found initrd image: /boot/initrd.img-3.13.0-36-generic
Found linux image: /boot/vmlinuz-3.13.0-35-generic
Found initrd image: /boot/initrd.img-3.13.0-35-generic
Found linux image: /boot/vmlinuz-3.2.0-61-generic
Found initrd image: /boot/initrd.img-3.2.0-61-generic
Found linux image: /boot/vmlinuz-3.0.0-17-generic
Found initrd image: /boot/initrd.img-3.0.0-17-generic
Found linux image: /boot/vmlinuz-2.6.38-12-generic
Found initrd image: /boot/initrd.img-2.6.38-12-generic
Found linux image: /boot/vmlinuz-2.6.32-25-generic
Found initrd image: /boot/initrd.img-2.6.32-25-generic
Found linux image: /boot/vmlinuz-2.6.31-21-generic
Found initrd image: /boot/initrd.img-2.6.31-21-generic
Found linux image: /boot/vmlinuz-2.6.28-16-generic
Found initrd image: /boot/initrd.img-2.6.28-16-generic
Found memtest86+ image: /boot/memtest86+.elf
Found memtest86+ image: /boot/memtest86+.bin
Found Ubuntu 14.04.1 LTS (14.04) on /dev/sda1

The main idea here is to create a chroot environment similar to my regular Ubuntu setup, then install GRUB from there. I ran update-grub to make sure any disk identifiers would be updated to point to the SSD rather than the old hard drive. Unfortunately, a small glitch happened: update-grub detected the Ubuntu setup on my hard drive. To get rid of this, I had to unmount the old hard drive and unplug it! After rerunning update-grub, I got the correct configuration.

Updating mount points

Since I rebuilt the home partition rather than copying it, its UUID changed, so I had to update the mount point in /etc/fstab. I thus had to run the following:

root@ubuntu:/# cd /dev/disk/by-uuid/
root@ubuntu:/dev/disk/by-uuid# ls /dev/disk/by-uuid/ -l | grep sdg6
lrwxrwxrwx 1 root root 10 Oct 26 15:37 fb543fcb-908a-463d-bc1f-896f1892e3ad -> ../../sdg6
root@ubuntu:/dev/disk/by-uuid# ls /dev/disk/by-uuid/ -l | grep sdg1
lrwxrwxrwx 1 root root 10 Oct 26 16:11 54f4cbd6-aed0-4b43-91c0-2f8d866f3ee3 -> ../../sdg1

Once I had figured out the UUIDs, I opened /media/ubuntu/etc/fstab with gedit and made sure the mount points were correct. I only had to update the UUID of the /home partition.
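For reference, the updated lines in /etc/fstab end up looking like this, using the UUIDs listed above (the mount options shown are typical Ubuntu defaults and may differ from your existing entries):

```
# /     was transferred to /dev/sdg1
UUID=54f4cbd6-aed0-4b43-91c0-2f8d866f3ee3  /      ext4  errors=remount-ro  0  1
# /home was recreated on /dev/sdg6, hence the new UUID
UUID=fb543fcb-908a-463d-bc1f-896f1892e3ad  /home  ext4  defaults           0  2
```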

Test boot

After all these preparatory steps, it was time for a test! I powered off the computer and made sure my old 1Tb hard drive was unplugged and my new SSD was hooked up. I turned on the PC and waited for the forever-lasting BIOS POST. Why does it take so long to boot a desktop, while a laptop almost instantly hands off control to the OS? After the BIOS handed off control to the OS, I got a blank screen with a blinking cursor, nothing else. I tried a second time: same result.

So after all these efforts, would I really have to format and reinstall from scratch? It seemed so. Before doing that, I rebooted my machine once again and entered the BIOS setup by hitting the DEL key. Once there, I looked at the hard drives and found that the SSD was detected, but not on SATA port 0.

I turned off the machine and connected the drive to a different port, the one that seemed to be first. Looking into the BIOS setup again, my SSD was now on port 0. Ok, let's try that one last time!

After a blank screen lasting too many seconds for an SSD boot, making me fear a frustrating reinstall, the Ubuntu logo appeared, and my desktop finally came up! A quick check confirmed that all the hard drives were present, except of course the disconnected 1Tb one. The SSD was ready to be installed into the machine!

The hardware part

The downside of SSDs is that they don't seem to fit in any regular desktop case, only in laptops! This is a very frustrating limitation. Why are these drives all 2.5″, or why don't cases have 2.5″ bays? When I shopped for my computer case, only high-end ones had 2.5″ bays, and those came with fancy mechanisms to make the drive removable without plugging any cables, which only adds more points of failure. Maybe at the time I am writing this post, some cases with SSD bays are available, but that doesn't matter; I won't change my case unless I really need to!

Before installing the SSD, I first removed that old 1Tb drive. I just had to remove four screws from my drive cage and slide the drive out.



To install my SSD into my HTPC case, I had a bunch of screws as well as an OCZ bay bracket. Just screwing the drive into the adapter's tray took me forever: I had trouble finding screws that fit, and there was already a screw in one of the SSD's holes, for some reason, which took me almost five minutes to notice. I then had trouble aligning the screws with the holes, and I was getting more and more tired and prone to dropping screws. At least the screw I dropped fell on my table, so I didn't have to search for it on the floor forever.

The following picture shows the drive in the bracket.


Then I had to screw that assembly into the drive cage of my case. Unfortunately, the upper bays of the cage only offer bottom holes, while the SSD adapter only has side holes! I thus had to screw the adapter into one of the bottom bays, which are definitely designed for hard drives, with their rubber pads to absorb vibration. None of my screws fit well. It seems the SSD adapter's holes are smaller than normal while the screws for the drive bay are larger than usual! I got it in after more than 15 minutes of attempts. Several times, I thought I would have to postpone this job and wait for my father to come by with a drill to make some new holes in the SSD bracket or the case.

The following picture shows the drive in the cage.


I don't know exactly how much time I spent on this installation, but at the end, I was tired and asking myself whether all this would be worth it.

Well, after the SSD was screwed in and the drive cage was back in my HTPC case, I realized I wouldn't be able to hook up my four SATA drives! No matter what I tried, there was always one drive lacking power. This was because the SATA cables coming out of my power supply unit were too short to accommodate the drive layout I had come up with! Ok, I was at a dead end now.

Before giving up and bringing that beast to a computer store in the hope they would figure out a way to hook the four drives up (maybe with some extension cable I don't have, or a new PSU), I remembered that the 1Tb drive I removed had been in the middle upper bay, which was now empty. My only hope of getting the drive powered that day was thus to move one of my hard drives there. Ok, so let's remove the cage again and play with the screwdriver once more!

I moved my 3Tb drive from the side bay to the upper one and put the drive cage back into my case. I was then able to hook up power. Reaching the SSD in the side bay to hook up the SATA cables was a bit tricky, but I finally got it. A last check showed that all my drives were hooked up, except my blu-ray writer. Ok, just one more cable to plug in, and that was it!

Was this all worth it?

After all this hassle, I asked myself this question. When I booted up the machine, it seemed as slow as with the hard drive. What? Maybe the CPU is too slow after all. But when I reached the desktop and started XBMC, the system felt more responsive.

More importantly, the machine became a lot quieter. For a few weeks, this HTPC had been making a lot of noise. I thought it was the CPU fan stressed by the Minecraft server running on the system, but the 1Tb hard drive was contributing to the noise as well. I suspect it was emitting more and more heat, causing the temperature to rise inside the case and heating up my poor little CPU. The CPU fan then reacted by spinning like crazy.

Even after I restarted my Minecraft server, the noise didn't come back. I am still surprised by this effect, which I hadn't anticipated.

This 1Tb hard drive is definitely getting old and has emitted some suspicious sounds a few times. I wonder whether it would have failed and died had I left it in the machine. This SSD move may thus have saved me an unexpected reinstall and will help me have a better time with this HTPC.

So yes, after all, it was worth it!

One SSD instead of two: simpler or not?

My Core i7 machine, named Drake, had two 120Gb SSDs. I purchased the first one with the machine and put Windows 7 and Ubuntu on it. Then I needed more space for Mac OS X, so I added a second 120Gb SSD. Mac OS X became a pain, almost unusable, because everything was too small. When I reached the point of lowering the screen resolution just to run Thunderbird comfortably, I got rid of Mac OS X. Then Windows 7, upgraded to Windows 8, started to eat up more space, so I needed to move Ubuntu to the second SSD.

I ended up with a brittle configuration composed of the ESP (EFI system partition) on the second SSD, Windows 8.1 on the first drive and Ubuntu on the second. I was waiting for a special deal on a 240Gb SSD and finally got one from TigerDirect at the beginning of September 2014. However, purchasing the SSD is only the easy part. Migrating data from two SSDs to a single one, with Windows 8.1, Ubuntu 14.04 and UEFI in the way, is an incredible source of headaches. This page shows how I got it done.

The easy way: reinstall everything

That would have worked ten, maybe even five, years ago. Yes, just reinstall Windows, a few drivers, a few programs, put back Ubuntu, adjust some settings, fine-tune a bit, and enjoy the rebirth of the system, coming back to life and full functionality. Things have changed over the years, and not for the better. Now that Microsoft and hardware manufacturers assume people won't install things themselves and will rather purchase hardware with everything preinstalled and preconfigured, setups have become more and more time consuming. Just installing Windows 8 takes more than 45 minutes, and although I could obtain a DVD with Windows 8.1, my Windows 8 serial number won't work with it. I would have had to install Windows 8, then upgrade to Windows 8.1 again!

Then come the drivers. Since I purchased my motherboard before Windows 8 was released, all my motherboard CD has to offer is Windows 7 drivers. So I cannot use the easy auto-install tool that performs an unattended setup. I rather have to download every driver separately from Asus, run them, wait, reboot, run the next one, etc. Then there is the NVIDIA driver, requiring a 100Mb download and yet another installation taking more than five minutes, and yet another reboot. Maybe I chose the wrong motherboard. By sacrificing a few USB ports, S/PDIF audio and maybe some PCI Express slots, I could perhaps get something simpler requiring fewer drivers, able to make use of what is prepackaged within Windows. That's still to be investigated.

Then come the programs. Yes, Ninite can install many programs for me automatically, but not GNU Emacs or GnuPG, and it won't configure my Dropbox, resync my Firefox bookmarks or reinitialize my Thunderbird email settings. It won't link my Documents, Images, Music and Videos default folders back to my data hard drive.

And then come the licenses. How will Windows 8.1 activation behave? Will it happen smoothly, or will Windows decide that this change of SSD is too much and require me to call Microsoft to activate by phone, forcing me to exchange, by voice, over a poor channel, tens of nonsensical digits? After Windows 8.1 activation, my DAW, Ableton Live, also requires authorization. I'm not sure it will reauthorize, since I activated it on my main PC as well as on my ultrabook. That means additional hassle.

Bottom line: reinstalling is a pain, and that is just the Windows side. Ubuntu installation is usually smooth, but when a single thing goes wrong, it requires hours of Google searches.

This is why I wanted a better way. I was so tired of this tedious process that I was considering giving up on this machine and using my ultrabook instead if the data transfer failed. But my ultrabook, with its 128Gb SSD, doesn't have enough storage for editing music made of samples or recording and editing Minecraft videos.

Preliminary connection of the new SSD

Before installing the new 240Gb SSD into my system permanently, I wanted to be sure I would be able to transfer my two operating systems (Windows 8.1 and Ubuntu 14.04) and make them boot. I thus only plugged the disk in rather than mounting it right away in my case. I fortunately had some free SATA power cables as well as an extra SATA cable and port. That allowed me to connect the new drive without disconnecting the others. This way, it would have been easy to roll back in case of difficulties forcing me to reinstall everything, and then think about another strategy or gather my courage and patience for the full reinstall.

I then booted from a USB stick with a Live installation of Ubuntu 14.04. This was necessary to perform the data transfer on a totally offline, clean, file system.

Before transferring anything to the drive, I ran a SMART self-test. For this, I installed smartmontools with apt-get and ran sudo smartctl -t long /dev/sdb. At that time, /dev/sdb was the new drive's device. The test took almost an hour, but I could leave it running and do something else.

The self-test found no defects. I learned to do this preliminary step the hard way when I assembled a machine for my parents. The hard drive failed shortly after, while I was configuring Windows, and I had to RMA it. Performing a self-test first might have saved me a waste of time and some frustration.

With the drive clean of any defect, at least from the point of view of the self-test, I moved to the next step: data transfer.

GParted is the king!

A long time ago, my only friend for partitioning and drive transfers was Partition Magic, from PowerQuest, later acquired by Symantec. That time is over, thanks to GParted, a free open source tool that comes with Ubuntu. But this time, my job was pushing GParted to its limits. Here are the operations I needed to perform with it:

  1. Create a GUID Partition Table (GPT) on the new SSD. This is because I want a pure UEFI-based system; it is not strictly necessary otherwise, since the drive is far from the 2Tb limit!
  2. Copy the first partition of my second SSD to the beginning of the new drive: this is the ESP.
  3. Copy the first partition of the first SSD: this is Windows' 128Mb system reserved partition. That copy wasn't possible, because GParted didn't know the partition type. I thus left a 128Mb hole declared as Unformatted, to figure a way out later on. I was hoping Windows could recreate the data on this partition.
  4. Copy the second partition of the first SSD: this was the main Windows partition.
  5. Copy the 40-ish Gb partition of my second SSD to the end of the new drive: this was my Ubuntu home partition.
  6. Copy the 20-ish Gb partition of my second SSD at the bottom of the free space on the new drive: this was my main Ubuntu installation.
  7. Create an extra 20Gb partition on the new drive in case I would like to give a new Linux distribution a shot.
  8. Create a 16Gb swap partition on the new drive for Ubuntu's use.
  9. Resize the main Windows partition to take the rest of the space.

Phew! This long sequence, gathering pieces from different sources, reminds me of infusion crafting in Minecraft's Thaumcraft mod, where essentia and items are combined on an altar to craft powerful magical objects.

I hoped that sequence would work, but it failed at step 5. For no obvious reason, GParted wasn't able to copy my Ubuntu home partition to the end of the new SSD! I had to leave an 8Mb gap and then resize the partition to fill it. I then performed the other operations one by one. That was a quite tedious job, because the mouse pointer was too small and impossible to enlarge without a system hack (an Ubuntu bug since 11.10! They chose to remove the option to resize the mouse pointer rather than fix the issue.), and sometimes clicking would open a menu and close it right away rather than leaving it open.

The following image shows the final layout. Isn't that great? I'm not at all sure this is simpler with one drive than with two, after all…


After this transfer process, I tried to recreate the entries for Windows and Ubuntu in my UEFI's NVRAM using efibootmgr. I then unplugged the SATA cables of my two 120Gb SSDs from my motherboard and rebooted the PC. I won't state the exact commands I used here, because they just failed. The system wasn't booting at all.

Fixing Ubuntu

Back to my Ubuntu live USB, after at least five attempts, because my motherboard is apparently defective, missing the F8 key from time to time and forcing me to jump into Setup and change the boot order from there to boot the UEFI USB stick. Boot time with that Asus board is desperately long. Waiting 15 to 20 seconds from power-up to boot loader is a shame when a 300$ laptop takes less than a second! But the laptop lacks the storage expandability I need, so I am always stuck on one end or the other.

Then comes the fun part. I am pretty surprised there is no easier way to restore GRUB than the following. I read about boot-repair, but it is simply absent from the live image, probably yet another PPA to copy, paste and install. Anyway, I ended up getting it to work.

First I found the partition where Ubuntu was installed, /dev/sda5, and mounted it: sudo mkdir /media/ubuntu && sudo mount -t ext4 /dev/sda5 /media/ubuntu. I did the same with my ESP: sudo mkdir /media/efi && sudo mount -t vfat /dev/sda1 /media/efi.

The second step was to establish bindings:

sudo mount --rbind /dev /media/ubuntu/dev
sudo mount --rbind /proc /media/ubuntu/proc
sudo mount --rbind /sys /media/ubuntu/sys
sudo mount --rbind /media/efi /media/ubuntu/boot/efi

That made these directories inside my Ubuntu mount mirror the corresponding top-level directories exactly.

Then I had to chroot into my Ubuntu, using

sudo chroot /media/ubuntu

After all this, the system was behaving much as if I had started a shell on my real Ubuntu setup. From there, I tried

sudo update-grub2

That just updated GRUB's entries, not the EFI ones, so it didn't fix the boot.

Then I tried

sudo grub-install

If I remember well, no arguments were necessary, and that fixed my GRUB EFI installation and added the Ubuntu entry back to the NVRAM. This worked only once /boot/efi was correctly referring to my ESP. Note however that for this to work fully, the live Ubuntu USB had to be booted in UEFI mode, not in the default MBR mode.

One reboot later, I was starting my Ubuntu setup, fully intact and working! Half of the transfer done! Well, not quite…

Windows was failing to boot and Ubuntu’s update-grub wasn’t detecting Windows anymore. Quite bad.

Windows: desperately dead

Windows, on the other hand, wasn't booting at all. It was showing a blue screen suggesting I use the repair tools from the Windows DVD. Last time I did this, the tools ran for at least a minute and bailed out, so I had to do a complete refresh, which ended up wiping everything and leaving only applications from the Windows Store. If I have to choose between such a messed-up repair and a clean install, I would bet on the second option.

Before entering this reinstall nightmare once again, I tried to recover the reserved partition. For this, I plugged my Windows 120Gb SSD back in and booted from my live USB stick, to make sure Windows would not kick in and see two copies of itself (one on the old SSD, one on the new). If Windows sees two copies of itself, it changes the disk ID of one copy. If the new drive is the one changed, everything is messed up and Windows cannot boot anymore until a refresh is done (and then everything is messed up again!). Back in my live USB session, I used dd to transfer the bytes of the old reserved partition to the new one. I also made sure the new /dev/sda2 reserved partition was marked as such in GParted, by modifying the flags. That changed nothing.

The post How to repair the EFI Bootloader in Windows 8 literally saved me hours of work! It gives a procedure to fix the boot loader. The main idea is to open a console from the Windows DVD and run the bootrec /fixboot command from the EFI\Microsoft\Boot\ directory of the ESP, followed by bcdboot with a couple of arguments, again from the ESP. Luckily, I had my ultrabook, which was quite handy for checking the page while I was running the commands on my primary PC.

That solved the issue and allowed me to boot into Windows 8.1! PHEW! Quite a nice step forward.

GRUB not detecting Windows

Now that my machine was able to boot into both Windows and Linux, one could wonder what was still missing. Well, I had no easy way to choose which operating system to boot at startup. Originally, GRUB offered me the option to boot into Windows or Ubuntu. After the transfer, it only saw Ubuntu.

I found procedures to manually add an entry for Windows, but they involved finding and copy-pasting drive UUIDs and probably redoing the change on each kernel update. I didn't want that. Another possibility was to install an alternative EFI boot loader like rEFInd, but these have a tendency to display many unwanted icons that do nothing. I got enough of that trouble while fiddling with triple boot (Windows, Linux, Mac OS X).

There seemed to be no way out: people either added Windows manually, or things worked out of the box for them. I had to spend more than 45 minutes inspecting the os-prober script and walking through it! By looking at the script and its logs in /var/log/syslog, I managed to find out it was skipping my ESP because the partition was not flagged as Boot! I fixed that from GParted, reran sudo update-grub and tada! GRUB was seeing Windows!

This is NOT the end!

Then I had to proceed with the hardware installation of the new drive. Since I was too impatient to get an SSD, I ended up with an ill-designed system. If I had waited another year before purchasing my Core i7 PC, I would have got a superb case with support for SSDs. Now I have a CoolerMaster-style case with only standard 3.5″ drive bays and need to fiddle with SSD brackets. Screwing the SSD into these is a painful process of trial and error. Then the assembly doesn't fit well with the screwless mechanism of the case. It somewhat holds in place, but it's not the smooth installation of a regular 3.5″ drive.

Some more fiddling later, my new SSD was plugged into my PSU and motherboard, and I got rid of the two extra SATA cables. I stored them away; they will be useful sooner rather than later, because my two 120Gb SSDs won't remain unused.

I plan to put one of them into my HTPC, which will be another adventure of its own. My HTPC has only four SATA ports, all used up, so I will have to get rid of one hard drive.

Bumpy Android upgrade

I recently joined the club of unfortunate Galaxy Nexus owners whose device has started down the path of death. Many people have told me bad things about the Nexus and about other Android smartphones in general. My brother's device is slow and, for some obscure reason, mixed up its sounds altogether. For example, the device emits the sound of a photo camera when locked and unlocked! My sister's phone is slow as hell, putting her through torture each time she opens an application. One of my friends' phones has no working mic anymore; he has to leave headphones plugged in at all times to answer calls. A colleague at my workplace had issues with the USB port: the device was not charging anymore.

My problem is sporadic reboots, several times a day, and sometimes boot loops. I thought my phone was dying, but I found something that may give it a second life. I will have to see in the long run, but this was nevertheless an interesting adventure.

The symptoms of my Galaxy Nexus

This started a few months ago, on Thursday, March 27, 2014. The phone entered a boot loop and could not do anything other than reboot like crazy. A colleague and friend of mine managed to remove some applications in a hurry, before the next reboot, and that seemed to stabilize the monkey for a few minutes, but it really just increased the length of the boot cycles. The device was rebooting like an old agonizing 486 overloaded with Windows 98! As a last resort, I tried a factory reset, which helped… until last week. Yes, the device started to reboot again!

I woke up on Thursday, July 24, 2014, and noticed that my phone was stuck on the Google logo. Nothing would get it unblocked, except removing the battery and putting it back. I did that, rebooted the device, and it got stuck again. Argghhhh!!! I removed the battery once more, left the device and battery on my desk and searched for some solution, to no avail, except that in some cases, a bug in Android 4.2 was causing the phone to boot loop, and it would get unstuck after a few attempts. I put the battery back and tried again: this worked. Maybe removing the battery for a few minutes discharged some capacitors and reset the hardware to a cleaner state; maybe I was lucky; maybe both. But the device remained unstable and prone to reboot, sometimes twice in an hour. The following Sunday, I got fed up and did a factory reset, then didn't install any application until I found something longer term to fix the issue. The device then worked without any reboot, so a hardware defect is less likely, although still possible. I need to keep in mind that I dropped the phone a couple of times, including once on my concrete outdoor balcony.

That means at least one installed application was interfering with the OS and causing it to reboot! This is unacceptable in a Linux environment, where each process should be well isolated from the others and from critical system components. A process should not be able to reboot the device unless it runs as root, but my device was not rooted, so no installed application could run a root process! That led me to the conclusion that something in the OS itself was flawed, opening an exploit that applications can use, intentionally or not, to harm the device!

An average user cannot do much about that, other than refraining from installing any applications, factory resetting the phone every now and then, or contacting his phone service provider and getting whatever cheap replacement the provider will be kind enough to grant him until the end of his agreement. I didn’t want to hit the same wall as my brother and get something with a smaller display, bloated with branded applications. If I really have to get a new phone, it will be a Nexus free of crapware or, if I cannot get a Nexus, I am more and more ready to take a deep breath, give up on whatever I will need to give up, and go for an iPhone.

First upgrade attempt: not so good

However, I had the power and will to do something more about this! It was a bit unfortunate for my spare time, my stress level and maybe my device and warranty, but I felt I had to try. If the OS has a flaw, why can’t I upgrade it to get rid of the flaw and move past this issue? Well, not all Galaxy Nexus phones are equal. US models run the Yakju firmware from Google, but Canadian models ship with a special firmware from Samsung instead! The Google firmware is the one that gets updated more often, up to Android 4.3. Samsung’s philosophy differs from Google’s: if you want an upgraded Android version, replace your phone.

That led me to the next logical step: can I flash the Yakju firmware on my Canadian Galaxy Nexus phone? Any phone provider, any reseller, any technical support guy will tell you no, but searches on Google will tell you YES! For example, How to: Flash your Galaxy Nexus Takju or Yakju To Android 4.3 is the guide I started from.

The first thing I had to do was install Google’s Android SDK on my Windows 8.1 PC. Yep, you need the full-blown SDK! The simplest solution is to get the Eclipse+SDK bundle, so at least you don’t have to mess around with the SDK Manager to get the full thing. Then I had to set up my PATH environment variable to add the tools and platform-tools subdirectories to my path, so adb and fastboot would be accessible from the command line. I also had to download the Yakju firmware from Factory images for Nexus devices.
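For reference, here is a minimal sketch of that PATH setup in a Unix-like shell (the SDK install location is hypothetical; adjust it to wherever you unpacked the bundle — on Windows, the equivalent is editing the PATH variable in System Properties):

```shell
# Add the SDK's tool directories to PATH so adb and fastboot resolve
# from any shell. The install location below is hypothetical.
SDK="$HOME/android-sdk-linux"
export PATH="$SDK/tools:$SDK/platform-tools:$PATH"

# The two SDK directories now come first in the search path:
echo "$PATH" | tr ':' '\n' | head -n 2
```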

The second step is easy to forget when recalling the exact sequence I performed to reach my goal. It is as simple as plugging the phone into a USB port of a computer. That requires a USB cable and, of course, a free USB port. Any port will do, provided it works; if in doubt, test it with a simple USB key.

The next step was to put my device in USB debugging mode. I searched and searched for the developer options, to no avail! Googling around, I found Android 4.2 Developer Mode. Bottom line: I had to go into the phone’s settings, tap on About Phone, then tap seven times on the Build Number! This is just shockingly crazy: how was I supposed to find that out? Fortunately, after I unlocked the developer options, I was able to turn on USB debugging. Without USB debugging, ADB cannot communicate with the device.

This was necessary for a simple but nevertheless crucial step: running adb reboot bootloader. This reboots the device into the boot loader, a kind of minimal OS from which it is possible to flash stuff onto the device’s memory. I read about procedures involving pressing the power and volume up/down buttons, but that never worked for me. This is probably like booting the iPhone into DFU mode, required to jailbreak or to recover from very nasty failures: you have to watch dozens of videos, try it fifty times and get it right by luck once in a while. These kinds of patience games get on my nerves and make me mad enough to throw the phone away. Fortunately, adb reboot bootloader, with the device plugged into my computer and in USB debugging mode, did the trick.
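The handshake boils down to two commands. Here they are wrapped in a small function, purely as a sketch, with an ADB variable so the sequence can be dry-run without a device attached:

```shell
#!/bin/sh
# Reboot the phone into its boot loader over ADB. Setting ADB="echo adb"
# dry-runs the sequence (prints the commands instead of executing them).
ADB="${ADB:-adb}"

enter_bootloader() {
  $ADB devices            # the phone must be listed with "device" status;
                          # if not, USB debugging is off or the cable is bad
  $ADB reboot bootloader  # reboot into the boot loader
}

# With the phone plugged in and USB debugging on, call: enter_bootloader
# Dry run: ADB="echo adb" enter_bootloader
```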

Once in the boot loader, you can use Fastboot to interact with this minimal OS. Like ADB, Fastboot comes with the Android SDK. However, Fastboot wasn’t working for me: I was stuck at the “Waiting for device” prompt. I started Googling again and found awful things: a driver to download from obscure places and install, a driver that may differ for Samsung devices compared to other Nexus phones, upsetting stuff about the driver not working on Windows 8 without a complicated tweak to disable driver signature validation, root toolkits that could supposedly simplify my life if I installed yet another hundred megabytes of applications onto my PC, etc. Flooded with all of this, I gave up and just let my phone run as is. Getting out of the boot loader is easy: just hit the power button and the phone reboots as normal.

The Penguin saved the deal!

However, one week later, an idea took shape in my mind and begged to be tested: Linux may have the needed driver built in, so it would be worth trying from my Ubuntu box. That’s what I did on Friday evening, August 1, 2014, and it was a success after a couple of hurdles.

First, I had to install the Android SDK there as well. Once adb and fastboot were accessible, I switched my phone into the boot loader once again, using adb reboot bootloader. Then I tried fastboot devices to get, again, this stupid “Waiting for device” message. I don’t know exactly how I got to that point, but that command finally output a “permission denied” message. Ok, now I know what to do: sudo fastboot devices. Well, no: fastboot not found! I had to use the absolute path of fastboot for it to work, but I finally got a device ID. Yeah, the data path between my Ubuntu box and my phone was established!
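The “fastboot not found” part has a mundane explanation: sudo resets PATH to a minimal value (the secure_path setting in /etc/sudoers on Ubuntu), so a directory added to my own PATH is invisible to root. A small runnable demonstration, using a stand-in script instead of the real fastboot:

```shell
#!/bin/sh
# Simulate the "sudo can't find fastboot" situation: a fake fastboot
# lives in a directory that is on our PATH, but not on the stripped-down
# PATH that sudo would use.
TOOLS=$(mktemp -d)                       # stands in for platform-tools
printf '#!/bin/sh\necho fake-fastboot\n' > "$TOOLS/fastboot"
chmod +x "$TOOLS/fastboot"
export PATH="$TOOLS:$PATH"

fastboot                                 # found on our extended PATH

# Under a sudo-like minimal PATH, the same lookup fails:
env PATH="/usr/sbin:/usr/bin:/sbin:/bin" sh -c 'command -v fastboot' \
  || echo "not found, as under sudo"

# Workarounds: call it by absolute path (sudo "$TOOLS/fastboot" devices),
# or preserve the caller's PATH: sudo env "PATH=$PATH" fastboot devices
```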

Next incantation: sudo fastboot flash bootloader bootloader-maguro-primemd04.img. That gave me a failure, AGAIN! Ok, that’s great, my phone will definitely not accept commands from Fastboot! Maybe it is factory locked to deny them? But before jumping to conclusions, I should have read the error message more carefully and completely. It said the following:

FAILED (remote: Bootloader Locked - Use "fastboot oem unlock" to Unlock)

It even gave the incantation needed to go one step further. I thus ran the command, prefixed with sudo. That popped up a message on the phone’s screen asking me for confirmation. I moved the cursor to Yes with the volume up/down buttons, pressed the power button and voilà, boot loader unlocked!

Why did I have to unlock the boot loader? Probably because I was switching to a different kind of firmware; with a US phone, Yakju updates would presumably have arrived over the air, with no unlocking needed. The unlock operation is not without consequences: it wipes out all data on the device! This was a minor issue at this stage, since I had refrained from installing anything or doing extensive configuration until I found a way to improve the stability of my device. I thus wiped it without asking myself any questions about important data to back up.

Then, feeling like a wizard who has gathered all the components to cast a spell, I entered the following command and watched the output.

eric@Drake:/media/data/yakju$ sudo ~/android-sdk-linux/platform-tools/fastboot flash bootloader bootloader-maguro-primemd04.img 
sending 'bootloader' (2308 KB)...
OKAY [  0.258s]
writing 'bootloader'...
OKAY [  0.277s]
finished. total time: 0.535s

Victory! Not really… That was just the first step! The next step was to reboot the device, using sudo fastboot reboot-bootloader. My phone’s screen went black for a couple of seconds, long enough to fear a heart attack, then the boot loader came back! Phew!

Ok, now the radio: sudo fastboot flash radio radio-maguro-i9250xxlj1.img. That went well, similar to the boot loader. Then I had to reboot again: sudo fastboot reboot-bootloader.

Now the main thing: sudo fastboot -w update image-yakju-jwr66y.zip. That took almost two minutes, then my device rebooted automatically, this time in the new firmware. Done!
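The whole flashing session condenses into a few commands. Here is the sequence as a small sketch of a script, using the file names from the archive I downloaded (yours may carry different version suffixes). The FASTBOOT and PAUSE variables let you dry-run the sequence without a device:

```shell
#!/bin/sh
# Flash the Yakju factory images in order: boot loader, radio, then the
# main system image, rebooting the boot loader between each flash.
# Dry run: FASTBOOT="echo fastboot" PAUSE=0 flash_yakju
FASTBOOT="${FASTBOOT:-sudo fastboot}"
PAUSE="${PAUSE:-10}"     # seconds to let the boot loader come back up

flash_yakju() {
  $FASTBOOT flash bootloader bootloader-maguro-primemd04.img
  $FASTBOOT reboot-bootloader
  sleep "$PAUSE"
  $FASTBOOT flash radio radio-maguro-i9250xxlj1.img
  $FASTBOOT reboot-bootloader
  sleep "$PAUSE"
  $FASTBOOT -w update image-yakju-jwr66y.zip   # -w wipes user data
}

# Call flash_yakju with the phone already sitting in its boot loader.
```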

After these manipulations, I was able to set up my phone normally. Once in the Android main screen, I accessed the phone settings and confirmed I was now on Android 4.3! At least I reached my goal.

What can I do next?

There are a couple of things I will try if the device starts rebooting again. Here they are.

  1. Install a custom ROM providing Android 4.4. Besides upgrading to the latest Android, this will give me extended battery life, as 4.4 greatly improved on this front, something I experienced with my tablet, which recently benefited from a custom 4.4 ROM. I will also be able to return to the baseline Yakju 4.3 if needed. Unfortunately, I had no way to back up my 4.2 firmware, so I cannot go back to it.
  2. Shop for a new phone. I will try to get a Nexus 5 and, if I cannot get one without switching providers, I will shop for an iPhone. Maybe I will find a store in Montreal selling unlocked phones, including Nexus devices; maybe I will have to wait patiently for my next trip to the United States to buy an unlocked Nexus 5 there; maybe I will be able to convince someone from a US office of my company to buy the phone for me and ship it to me (if I ship him a check for the price of the device, obviously!); maybe I will find something to make me happy on a web site I don’t know about yet. We’ll see.
  3. If all else fails, I will give up on installing any application and will use the Galaxy Nexus just as a phone and for casual Internet access with the stock browser. After my agreement with Fido ends next November, I will consider other, hopefully better, options.