Categories
PoneyMac

External display issues

Quickly enough, I wanted to plug the Mac into an external display. First, I would get a bigger screen. Then, I would be able to use a fully functional keyboard with working F1-F10, Alt/Option and Escape keys while sitting in front of my external monitor, rather than constantly leaning my head towards the far-away laptop while controlling it with an external keyboard/mouse.

Physical connection

I quickly found out the good news: that October 2006 MacBook Pro (I discovered the exact model using MacTracker this morning) has a real full-sized DVI output port! This is not the pesky mini-DVI requiring almost-Apple-specific adapters, nor even the newer mini-DisplayPort. No, a real DVI port! Ok, I have a DVI to HDMI adapter for that!

IMG_20141207_170411

Well, I realized with surprise that the DVI end of this adapter didn’t fit into the DVI port of the Mac. The exact reason why is still to be determined. However, I tried with a DVI to HDMI cable I had, and its DVI end fit. The DVI end of a DVI to DVI cable also fit.

I was thus able to hook up the MacBook Pro to my Dell 23″ touch screen through DVI->HDMI. But the touch interface, provided by a separate USB connection, didn’t work; only the keyboard and mouse worked through the USB hub built into the display. I also successfully hooked up the Mac to my old 22″ LG LCD. Ideally, I would have put the machine and the old LCD on a dedicated desk/table, with their own keyboard and mouse (ideally an Apple keyboard, if I can get my hands on one), but I don’t have a large enough table for this to fit comfortably.

The problem with my new Dell LCD is that connecting the cables is hard. The space behind it is quite tight. I thus try to leave cables connected at the monitor’s end and just plug/unplug the device ends. There is one loose HDMI cable hanging on my desk. I can plug something in directly if it has an HDMI output, or use an adapter to turn this HDMI into mini-DisplayPort (I actually purchased this adapter for a Dell ultrabook provided by Nuance, the company where I work), DVI, mini-HDMI or micro-HDMI.

The fact that the HDMI to DVI adapter didn’t work was kind of problematic for my monitor setup. I thought about several solutions:

  • An HDMI male to female cable. Most cables are male to male. It is a bit uncommon to have one female end.
  • A female to female HDMI coupler that would allow me to link two cables with HDMI male ends together.
  • An HDMI to DisplayPort cable. My Dell LCD has a DisplayPort port which is unused. I could try to use it instead of the HDMI port for connecting HDMI and mini-DisplayPort laptops. But I might have to purchase an additional mini-DisplayPort to DisplayPort adapter, and I am not sure the whole mini-DisplayPort/HDMI/DisplayPort combination would work.
  • A DVI to DisplayPort cable. It might work, but I would end up having to try my luck with another DVI end that may not fit into the Mac’s port, unless of course I purchase the damned cable at twice the price from an Apple store!
  • A new LCD with easier-to-access connectors. I may have to sacrifice the touch ability, which is totally unacceptable!

Yesterday afternoon, I tried to get the HDMI male to female cable from a local computer store. I thought I had it, but looking at it more closely, I found out it was a somewhat weird HDMI to VGA cable. The ends are shown in the picture below.

IMG_20141206_173730

I was quite depressed and exasperated when I noticed that. How will I be able to get my hands on the needed cable if I cannot rely on anybody to help me find it? I will have to wait forever for somebody sighted and versed in computers to come with me and check, or try ordering the thing online.

But wait. I do have what I need!!! On my way to the gym yesterday, I remembered the HDMI switch I was using with my old LCD monitor to multiply its digital inputs (it has just one DVI and one VGA port). This is exactly the HDMI female to female coupler I needed, just with extra HDMI inputs I don’t need for this particular, uncommon use case. Back home yesterday, I made the connection and it worked!

IMG_20141206_174043

Lack of flexibility

Quickly enough, I found out that Mac OS X would not display 1920×1080 on my LCD. It was matching the internal display’s resolution. I went into the Monitors preferences and bumped up the resolution, but Mac OS X stubbornly added black vertical bars, truncating the image. This looked similar to the picture below.

IMG_20141207_221607

I searched and found no solution, except closing the lid. Argh! That would prevent me from accessing the internal keyboard if I needed to change the sound volume, and the machine might become hot. Ok, let’s do it.

But that didn’t help! The laptop, instead of shutting off its internal display and sending video just to the external one, went into suspend mode! It was hooked up to AC power; it needs to be, because the battery exploded a while ago (before my brother’s girlfriend gave me the machine), so there is no battery at all!

IMG_20141207_221625

The first time I did this, after almost thirty seconds, my Dell LCD turned back on and I had 1080p.

IMG_20141207_221649

However, the fonts were so tiny that they almost drove me mad! Moreover, after that, the internal display wouldn’t turn on again.

I don’t know exactly why, but I had to reboot the Mac later on, which reverted it to the two-screen mode. Then closing the lid just suspended the Mac, without switching to the external display anymore. However, it was a bit easier to work with the two displays, because the truncated display gave slightly larger fonts than the full 1080p display.

At some point, I reached a dead end with the font size and finally lowered the resolution, but that didn’t help much. It just truncated the menu bar and things were not really bigger.

How about the “extended” display mode? Hitting F7 toggles between mirrored and extended modes. In mirrored mode, the two displays show the same thing. In extended mode, they act like one large screen. Extended mode didn’t work well for me, because the menu bar only showed up on the internal laptop LCD, and each time I tried to bring my lost mouse pointer to the topmost corner, it disappeared into the internal display!

The mystery of the closed lid

I think I found the way to get only the external display. After the laptop suspended when the lid was closed, I should have moved the mouse or pressed a key on the keyboard, which would have woken up the machine and forced it to use just the external display. I read that somewhere during never-ending searches about other issues.

Getting back to the internal display is a matter of disconnecting the external display, closing the lid and reopening it.

Categories
PoneyMac

Mouse issues

When I started exploring Final Cut Express on the Mac, I quickly got blocked by the mouse, which was driving me totally crazy. With the Mac mouse that came with the machine, it was somewhat working, but the pointer moved very slowly. I had to move the mouse, lift it, move it again, lift it, five or six times, to bring the pointer where I wanted. This quickly became a real pain. The trackpad works a bit better, but just a bit.

I tried with my PC Razer mouse in the hope I would get better results. The mouse worked, as opposed to the no-name pointing device I tried on my Hackintosh last year, but the pointer moved desperately slowly. I tried to tweak the pointer speed (there are buttons for that on my Razer mouse), but that made things worse! The pointer would move slowly if I moved the mouse a bit, then jump to a somewhat random place on the screen! It was almost impossible for me to use that!

Razer Synapse making my synapses mad

In an attempt to solve this, I tried installing the driver from Razer, Synapse 2.0. There is a Mac version. Installation went well, but after the system restarted, I got an error message popping up about RzUpdater having crashed. I had the choice to Ignore, Report or Relaunch. Tired of unstable software, I didn’t hesitate to try the Report option. That sent a (probably useless) report to Apple and the dialog box was dismissed, but it almost instantly reappeared. I tried to dismiss it with Ignore, with Relaunch, to no avail.

The message was always on top, covering other windows. I had to move it out of the way, to the bottom of the screen. It was always hanging around, impossible to get rid of completely.

Then started a more than 45 minute Google search that gave almost nothing except frustration. I got referred to an uninstaller program, found it using Spotlight and ran it, but it claimed to be incompatible with my version of Mac OS X. After a few searches and attempts, I found out that RzUpdater and RzEngine wouldn’t start on my version of Mac OS X, and moreover, the uninstaller wouldn’t start either! So I was stuck with no solution.

I then searched about cleaning up the startup applications and found this. It gave me something that allowed me to “repair” my system. The /Library/StartupItems and /System/Library/StartupItems folders were empty, which is perfectly normal, and the /Library/LaunchDaemons folder didn’t contain any Razer-related stuff. But /Library/LaunchAgents contained two Razer-related files, one about RzUpdater and one about RzEngine. Resisting the temptation to remove these files without any precaution, I made a backup copy, then proceeded with the removal.

I don’t know if or how this can be done from the Finder. I was tired of the GUI and opened a Terminal to do the surgery from the Bash shell, which I know and like far better than a GUI that doesn’t work well with the keyboard. A plain old rm didn’t do the trick: permission denied. But sudo rm worked; I had to provide the login password of the account I was logged into. The damned RzUpdater message came back after I dismissed it for maybe the 50th time, but one reboot later, it was gone for good.
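
For reference, the whole surgery boils down to something like the following; the plist file names are whatever the grep actually shows, and com.razerzone.* is only my guess at the naming pattern:

cd /Library/LaunchAgents
ls | grep -i rz                           # spot the two Razer-related plist files
mkdir -p ~/razer-backup
cp com.razerzone.*.plist ~/razer-backup   # backup copy first (adjust the pattern to the real names)
sudo rm com.razerzone.*.plist             # a plain rm fails with permission denied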

SmoothMouse not so smooth

After this miserable failure with Razer Synapse (it almost messed up the Mac forever and drove me nuts!), I tried more searches and found a free solution. My new hope: SmoothMouse! Happy to find something promising, I tried to download and install it. Unfortunately, it didn’t start at all: the tool is incompatible with Mac OS X 10.5 and works only with 10.6.8 or above.

I could try SteerMouse, ControllerMate or USB Overdrive, but all three cost 20$ to 30$ US. Without the Mac App Store (available only in 10.6, again!), I would have to stick my credit card in yet another place and be charged an undetermined amount of money, since I am unlucky and live in Canada, not in the US. I am more and more frustrated by all these artificial complications. Mouse support should be built into the OS, not require a third-party tool that needs to be purchased separately. Moreover, I feel that no matter which one I pick, I’ll hit a road block later on and will have to switch. If I were sure that one of the three tools would solve my mouse problem once and for all, no matter what future mouse I stick into the USB port of this apparently doomed MacBook Pro, I would gladly pay 20$ to the person that saved me!

An unexpected improvement

Feeling more and more likely to end up giving this machine back to my brother’s girlfriend, I decided to isolate my logins to personal profiles (Google, Facebook, Apple ID) in a dedicated account, rather than continuing to log in with her account. The process of creating a new account was easy and worked flawlessly. I then logged in with my new account and found out that the mouse was working a bit better. While not perfect, it is usable, so this is less of an issue than before. Moreover, the creation of the new account allowed me to start fresh with a cleaner, less cluttered dock.

However, I am finding more and more programs that cannot be installed on the obsolete, no longer supported Mac OS X 10.5. This is starting to be a road block, nearly a show stopper. It may end my adventure prematurely, unless I switch gears and go back to my early Hackintosh-based installation!

Categories
PoneyMac

Defective keys and erratic keyboard shortcuts

One of the first things I attempted to do on the Mac was to turn on full zoom and enlarge the mouse pointer. This can be done from System Preferences (Apple menu), under the Universal Access icon. Zoom was set to No, so I changed it to Yes.

Image 5

As a side note, the Mouse and trackpad tab offers a neat way to enlarge the mouse pointer. This is one of the greatest Mac OS X features.

Image 6

The Sight tab gives the keyboard shortcut to zoom and unzoom, unfortunately in a quite cryptic and hard-to-remember way. Instead of writing key names such as Command, Option, Shift, etc., Apple decided to use icons that make no sense. This is a real pain for me, because I have trouble associating these images with keys. But after some time, I figured out that the key combination to zoom was Command (the key marked with an apple just left of the space bar), Option (the key just left of the Command key) and =. However, hitting that key combination did nothing. I tried several times, without success. I also tried Command-Option-8 many times to make sure that zoom was actually enabled.

Starting to suspect the key wasn’t responding physically, I started a Terminal and launched xev, a utility I know from my UNIX/Linux background. xev listens for events and displays what X receives when an event occurs. This is a way to figure out what is generated when a key is pressed, a mouse button is used, etc. Pressing Command, Shift, =, and some other keys had an effect. However, pressing Option did… nothing.
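
For the curious, this is the kind of session I mean; the grep simply filters the verbose output down to the lines naming the keys, so each key press shows its keysym, such as Alt_L for the left Option key:

xev | grep keysym

A key that produces nothing in that output is, as far as X is concerned, simply never pressed.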

Ok, maybe some misbehaving application was intercepting the Option keys. To rule this out, I wanted to boot the machine off an Ubuntu USB stick. The main idea is to insert the stick and press Option at boot, just after hearing the chime. No matter how many times I tried this, it had no effect.

I finally tested with an external USB keyboard, but it was a PC keyboard, since I don’t have a spare Mac USB keyboard hanging around the house. This is not ideal, because some keys are in different places and quite confusing. In particular, to get the Command key on the PC keyboard, I have to press the Windows key, and Option is mapped to Alt. With this knowledge in mind, acquired from my experience with a Hackintosh, I tried Windows-Alt-= and got the zoom! Ok, so Mac OS X is processing the Option key, but the built-in keyboard is not capable of producing it.

The only way I can zoom in with just the built-in keyboard is by hitting Shift and scrolling with two fingers on the trackpad.

Note that I successfully got the EFI boot menu by turning on the Mac while hooked up to an external keyboard and hitting the Alt key after the chime. However, I had to repeat the manipulation five or six times before it succeeded. That reminded me of the memorable trouble I had while trying to jailbreak my iPod after upgrading to iOS 4; entering DFU mode was hard and no instructions given on the Internet were working. I had to use completely different timings than the ones given on web sites! It seems Apple likes these hard-to-guess procedures, so that it alone is able to debug things, but that excludes old out-of-warranty devices!

But that’s not the end of the story. Quickly enough, I wanted to open up the main menu bar without using the mouse. I know there is a way: CTRL-F1 to enable universal keyboard access, then CTRL-F2. This again comes from my Hackintosh experience. So I tried… with no luck.

After a little trial and error, I found out that F1 through F10 don’t produce F1 through F10 but rather trigger special behaviors, like adjusting LCD brightness, volume, keyboard backlight, etc. Fine, no problem with that, if I can get F1 through F10 another way. From previous experience with laptops, I figured that the Fn key combined with F2 would do the trick, so Ctrl-Fn-F2 would open up the menu bar. Well, no! The Fn key seems defective and unresponsive as well, same for Escape!

Any Google search about this got me absolutely no positive results. It seems I would have to replace the keyboard, which would involve purchasing a completely new chassis, disassembling the MacBook Pro and reassembling it in the new chassis. This is just a nonsensical and too costly job. I’m not ready, neither technically nor mentally, to engage in such an operation. I could do it on a desktop PC if I had to, because I know how they are put together, but I have no knowledge, no mental model, of how the components of a MacBook Pro are put together.

The lack of the Fn key also prevents producing the Home, End, Page Up and Page Down keys by combining Fn with the arrow keys. I will probably discover other impossible-to-obtain keys. For now, it is far better to hook up an external USB keyboard to this machine, especially to sit in front of a larger external monitor when using it.

Even with an external USB keyboard, some shortcuts don’t always respond. For example, Ctrl-F2 doesn’t always open the menu. It seems that when the machine is busy, it just happily skips the interpretation of the shortcut! Sometimes, the machine produces an annoying beep when I hit Ctrl-F2 instead of popping up the menu. Combined with other issues, this sometimes made me mad, almost drove me crazy!

The use of a PC keyboard complicates things. To get a /, I have to combine right Alt with the é key. The same right Alt acrobatics are needed for ~, {, }, [, ], |, etc. The ù is obtained from the key just left of the number 1, while on the MacBook Pro keyboard, it is left of the Z key.

Some keyboard shortcuts I am used to just don’t work: Windows-Tab instead of Alt-Tab, Windows-L instead of Ctrl-L, etc. The Home key, rather than going to the beginning of the current line, jumps to the beginning of the file, forcing me to go back to the point I was at while editing. This happens for documents in text processors as well as text edited in web interfaces such as WordPress. This is minor compared to the problems caused by defective keys or shortcuts not always responding, but it all adds up to a really bad and extremely frustrating user experience.

Categories
PoneyMac

Screen becoming black sporadically

I often have trouble tracking the mouse pointer on the screen, especially the default small one. When I lose track of it, I bring it back to the top left corner of the screen. The first time I did that on the Mac, everything went black. Even if I moved the mouse pointer or hit a key, nothing happened. The screen reappeared after a few seconds. This happened a few more times, but those times, it was possible to leave this mode by just moving the mouse.

I figured out two things. First, the screen would go black every time I brought the mouse pointer to the top left corner. Second, the machine was so overloaded at startup that things didn’t respond properly, not even exiting from this black mode.

I looked into System Preferences for a solution to this. The preferences are accessible from the Apple menu, which is available from all applications.

Image 2

I found something quite interesting in the preferences for Exposé and Spaces: configurable hot corners.

Image 4

The top left corner was configured to trigger the screen saver! Ah! I clicked on it and that popped up a menu offering different options, including one to disable this unwanted (at least for me) behavior.

Image 3

In conclusion, not a bug, just a feature. At least, this is configurable, as opposed to the intrusive GNOME 3 hot corner, which cannot be tweaked from any GUI, at least in the versions I tried.

Categories
PoneyMac

An old Mac = old and new problems

This is the first post of what I expect to be a very long saga. I don’t know exactly what I will gain from this venture, but I am tempted to engage in it because of my curiosity and taste for exploration. Will I get more patience, experience with Final Cut Express, with Mac OS X? Maybe, maybe not. For now, the only thing I am getting is trouble. It seems that nothing works as expected. Keyboard shortcuts randomly fail, the mouse moves erratically, scrolling in web pages is slow and choppy, and every, I say EVERY, web page shows up with a tiny font. I have to hit Command-+ more than 15 times to get a zoom level I am comfortable with.

This Mac comes from my brother’s girlfriend. Since she is nicknamed Poney, I decided to call this machine the PoneyMac for the time being. When I consider this beast tamed, it will symbolically become mine and I may assign it a new name of my own creation, but for now, this is The Mac, not my Mac.

This is a 2006 MacBook Pro with 1Gb of RAM and a hard drive whose exact size I don’t know yet. The machine runs Mac OS X 10.5 Leopard, unfortunately a 32-bit version. I’m not yet sure if the EFI is capable of booting 64-bit OSes; maybe not, which is quite bad.

All future posts of this saga will get the PoneyMac category, with tags corresponding to the topic.

Categories
Bug

Another upgrade that breaks things down

When I tried to write a WordPress post today, I found out that I couldn’t enter anything in the main text edit area. I tried several times, saved the draft without any success, and then found out I would be blocked unless I AGAIN tried with a different browser. I just cannot continue like this if I now have to leave several browsers open and switch from one to the other. This breaks any possibility of efficiently switching between windows. I would end up with several similar-looking windows, with no way to quickly distinguish them, as opposed to one browser window with tabs.

A Google search about this led me to a forum post. Some people were experiencing similar issues, again after a WordPress upgrade! Some people manually reinstalled an old version without success, but god, I cannot manually install things on my dumb HostPapa account, because I have NO SSH access! I am just exhausted from facing the same problems that I cannot resolve unless I completely switch gears, reformat, start over, etc.

A forum post suggested disabling all extensions, clearing the browser cache and switching to the default theme. I tried clearing the cache: no success. I never changed the theme, so I am probably using the default. I then had to disable extensions. The culprit was TablePress.

But the fundamental question is: what’s the point of having any extension if they constantly need to be turned off for one reason or another?

I’m now stuck with an unstable content management system whose extensions are broken, and there is no way to improve on this without manually moving everything to something else.

Categories
Bug Configuration

Cannot transfer files anymore from my Galaxy Nexus through USB

Last Friday, I tried to hook up my Galaxy Nexus phone through USB to transfer some files to my computer. After a few seconds, a completely empty Nautilus window appeared. Once again, Ubuntu was incapable of detecting my device. This happened a few versions ago, and I had to use obscure and impossible-to-remember MTP commands to perform transfers. I didn’t want to search for these commands again, tired of losing more and more time on artificial problems. If Ubuntu degrades to this point from one version to the next, I will be better off switching to Windows and installing Linux only in virtual machines. This is not the only issue I have, and most bugs (mouse pointer, Emacs, M-Audio sound stability) persist through version upgrades. Canonical now seems to focus on Mir and newer Unity versions, which I really dislike because Mir will break everything for five or six versions. Either that, or Canonical will cut corners on keyboard accessibility, resulting in a UI that will be almost unusable for me. I expect this will be my hardest Linux time ever when that beast comes out, until it stabilizes.
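
For the record, the kind of obscure commands I mean come from the mtp-tools package; the file ID and file name below are only placeholders, and the real IDs change every time, which is part of what makes this so tedious:

sudo apt-get install mtp-tools
mtp-detect                      # check that the phone is detected as an MTP device at all
mtp-files                       # list the files on the phone with their numeric IDs
mtp-getfile 1234 photo.jpg      # copy the file with ID 1234 to photo.jpg (placeholder values)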

However, my actual bug was worse than this: the phone didn’t connect through Windows either! When I plugged my phone into a Windows machine, an empty Explorer window similar to the one below came up and nothing else. Does that mean my phone is dying, progressively losing functionality? Probably.

Capture d'écran 2014-11-30 09.04.15

I didn’t attempt any Google search about this. It is worthless. I would find forum posts about people replacing the cable, doing factory resets, sending their phone for repair or replacement, etc. The phone hooks up and something is detected by the computer, so why would a new USB cable help? Yes, I can do factory reset after factory reset, and that will probably fix it, but what’s the point if I know I will have to redo it a few months later, for no reason, unless I install NOTHING on the phone? And all I would get from technical support are no-go solutions or offers for a new phone that would force me to switch to a more expensive plan with my provider, or buy something from nowhere with an old version of Android.

Before accepting this conclusion and starting to look at whether I could get a Nexus phone directly from Google rather than going through Fido and affecting my phone/data plan, or get something somewhat decent from DX.com, I tried with a different cable: same effect. Then I saw the home screen on the device and remembered I had recently set up a PIN to protect my Google account from any tampering by somebody who might get hold of my phone if I lose it, or steal it from me.

Screenshot_2014-11-30-09-03-22

I entered my PIN and saw with surprise and relief the following window on my computer:

Capture d'écran 2014-11-30 09.05.34

Tada! This time, the solution was simple! This was just normal data protection! So my USB connection is still working!

Note that locking the phone doesn’t shut off USB access until the cable is disconnected. The PIN also doesn’t prevent me from answering a call, so this is not as problematic as I feared it would be.

Categories
Bug

Firefox Sync duplicating bookmarks

A few months ago, there was an update to Firefox Sync. The system was completely changed, requiring the creation of a new account and the reupload of all bookmarks, history information, etc. The old system was still functional, but it was no longer possible to add devices.

Despite my frustration (yet another account to create), I registered with the new system and started removing my other instances of Firefox from the old system, to attach them to the new one. I have quite a few such instances: one on the Windows side of my personal computer, one on its Ubuntu side, one on my personal ultrabook, one on my HTPC, another one on my company’s laptop, another instance on my company’s ultrabook, and three on different virtual machines! That seemed to go pretty well, just uselessly time-consuming.

However, a few days later, I noticed it had become almost impossible to find anything in my bookmarks. Some bookmarks disappeared, but looking a bit deeper, I found multiple copies of bookmarks as well as of bookmark folders. This was a total mess. It would have taken me hours and hours to clean it up. I thought about exporting the mess to JSON and writing myself a script to clean it up and restore the processed copy, but I didn’t feel like working on this during my spare time. I thus left it alone and almost stopped using bookmarks. Sometimes, typing text in the URL bar finds stuff on Google, in bookmarks or in history. Sometimes, I would get the link from within an email. Other times, I had to open up the bookmark manager and search endlessly.

Last week, I decided to try something: simply restoring a backup copy of my bookmarks. I thought about an old copy I had made in January after a cleanup of the bookmarks, then I found that Firefox offers options to restore bookmarks saved periodically. I took the latest day with 400k of bookmarks instead of 700k, and that did the trick!

Unfortunately, a week later (yesterday in fact), I found out that duplication started again! So it seems that Firefox Sync now simply makes one copy of the bookmarks per instance of Firefox it syncs with! Why not, in this case, offer these copies in separate folders, so at least one would know what to expect?

I searched for a long time for a better way to manage my bookmarks, and everyone has adopted their own inconvenient or outdated solution. Some use iCloud with Safari to sync bookmarks and manually transfer them to Firefox, others propose Xmarks, which was discontinued years ago while its web page still offers the tool (last time I tried it, it couldn’t sync, just sat there trying to connect to a server), others adopt EverSync, others swear that Delicious is the best, etc. It seems I would have to choose one of these and be prepared to start the research over a few months later to find a replacement for my choice, which would have gone down in turn.

I tried to restore the backup once more. Maybe there is a problem with my SQLite Firefox DB on Windows 8.1; this Windows 8 box is more and more flooded with crap and would require a reformat/downgrade to Windows 7, which I just don’t want to do. Maybe, I thought, if I restore the backup on the Windows 8 box, the DB will be clean, and that will sync up with the rest. I tried, then went to my HTPC to see if things would get fixed. No result. I found out that my HTPC’s Firefox was still using the old sync, so I updated it.

Then the duplicate bookmarks came back again! I got fed up and removed all my bookmarks. There is no point in having bookmarks if Firefox Sync copies them once per machine it connects to, and I just cannot get rid of all my computers except one; that one would have to be my company’s laptop, which would end any possibility of playing Minecraft or attempting to compose music with Ableton Live, and I don’t feel comfortable leaving a copy of my personal data, including my diary, on my company’s laptop! And I would have to do all this just because of Firefox? No!!!

I was about to switch to another strategy consisting of using Evernote to store links. There is a capture tool, the Clipper, that allows making a copy of a web page, with its original link, so it can act as a kind of bookmarking system. This is at least better than the poor man’s system I was considering more and more, consisting of writing a plain old HTML page with links, uploading it to my HostPapa account and updating it from time to time.

But today, I found out that multiple Firefox instances had the same name. My Windows and Ubuntu installations share the same host name, since they don’t run simultaneously, and I am using the same login name, so Firefox gave the same default name to the computers. This may explain why it got the bookmarks mixed up! Moreover, the update from the old to the new system may result in additional bookmark duplication.

With this hypothesis in mind, I started all my Firefox instances, one by one, and verified that they all have different names and that the removal of bookmarks propagated correctly. At this point, all my Firefox instances have had their bookmarks removed, but I need to be extra sure there is no leftover instance I forgot that could restart the duplication like a virus. While doing that, I felt a bit like Jean-Luc Picard going from one version of the Enterprise to the other, in different time periods, to repair a temporal anomaly. Of course, my bookmark problem is far less dangerous than that!

I don’t know yet if that will allow me to put back my bookmarks in one instance of Firefox and see them reappear elsewhere, without duplication.

Categories
Configuration

One SSD for my HTPC

A bit more than a month ago, I successfully transferred my dual-boot Windows 8.1 and Ubuntu 14.04 setup from two 120Gb solid state drives (SSDs) to a single 240Gb drive. I ran into several problems restoring the bootloaders of the two operating systems, thought many times I would have to reinstall, then figured out a way to make them boot.

But what happened to the two drives I removed from my main computer? Well, they sat on top of a shelf. But at least one drive will be repurposed: it will become part of A.R.D.-NAS, my HTPC. On Sunday, October 26, 2014, I finally got the time and courage to undertake the transfer operation. This time, the software part was pretty smooth, but the hardware part was a uselessly intricate puzzle. During the process, I wondered many times about the purpose of generic hardware if it doesn’t fit well together, and complained about the lack of any viable alternatives.

The sacrifice

Well, my NMedia HTPC case has six 3.5″ drive bays. This is quite nice for an HTPC case. This is possible because I chose an ATX case, to get a motherboard with rear S/PDIF audio connectors rather than just headers accepting brackets I could find nowhere. This case is a bit bulky; I would build around a MicroATX case if I had to start from scratch.

So installing this SSD seemed obvious at first: just add the drive, transfer the Linux boot partition from the hard drive to the SSD, remove the original boot partition, set up GRUB on the SSD and ta-da. No, things are rarely that simple. I thought my motherboard had only 4 SATA ports, and they were all used: a 1Tb hard drive, a second 1.5Tb hard drive, a third 3Tb hard drive, and a Blu-ray reader/writer. Why so many hard drives? Well, I am ripping and storing all my video discs, even the huge Blu-rays, to avoid having to search for them on shelves.

Even if I had remembered correctly that I have six ports on the motherboard (two are free!), my PSU only has four SATA power connectors, so I would not be able to easily and reliably connect all my drives. I could try to find some splitter cables or Molex to SATA adapters, but that would add a point of failure. I could replace my PSU with one with more SATA power cables, but it would also have more Molex cables, PCI Express connectors, etc. Unless I went with a more expensive modular PSU, all these cables would clutter my case.

The safest and cheapest solution was to sacrifice one of the hard drives, the 1Tb one of course, the smallest. I thus had to move files around to have less than 120Gb of stuff on the hard drive that would be moved to the SSD. That process took a lot longer than I expected. My poor HTPC spent the whole Saturday afternoon copying files around! Fortunately, this is machine time, so I had plenty of time to experiment with music creation using my still-new UltraNova synthesizer combined with Ableton Live’s multi-track ability.

Preparation

On Sunday, I first burned the Ubuntu 14.04 ISO to a DVD. Yes, Ubuntu is now too large to fit on anything smaller than a DVD. After that, I shut down the Minecraft server running on my HTPC and moved its files to another, older PC. I started the server on the old PC and reconfigured the port mapping on my router. This way, if my friend wanted to kill a few zombies and creepers while I was installing my SSD, he would be able to do so, and I would not be stressed if something bad put my HTPC out of service (like something getting stuck in the CPU fan and breaking it).
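
Moving the server itself is nothing fancy, since the world and configuration live next to the jar; roughly the following is enough to bring it back up (the folder name and memory sizes are just what I typically use, adjust to taste):

cd ~/minecraft
java -Xmx1024M -Xms512M -jar minecraft_server.jar nogui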

I then removed the cover of my HTPC and spent quite a bit of time trying to figure out which one was the 1Tb hard drive. Based on the position of the SATA connector on the motherboard, I presumed it was the left-most drive. I thus had to disconnect the drive in the middle bay and use the freed-up power and data connectors to hook up my SSD. I then booted up the machine.

The following picture shows the drive temporarily hooked up.

DSC03934

Then I remembered my old 22″ LCD that I stopped using after purchasing my Dell touch screen. I went to pick it up in my computer room, put it on my kitchen table and plugged it in. This way, I would be able to have the screen right in front of me, with the keyboard on the table, rather than sitting in front of my 46″ LCD HDTV with the keyboard on my knees.

With the SSD and the LCD hooked up, I booted my HTPC and quickly stuck the Ubuntu DVD in the Blu-ray drive. After an incredible amount of time, the machine finally booted into the live Ubuntu DVD!

Data transfer

After Ubuntu started, I launched GParted and realized that I had chosen the wrong hard drive. The 1Tb drive containing my Ubuntu setup was the one disconnected. Oh no! So does that mean I have to turn off the machine, connect the right drive and wait once again through this stupidly long, almost five-minute, DVD-based boot? No, not this time! Feeling like a cowboy, I decided to try something: drive hot swapping. This is possible with SATA, so let’s see! I thus disconnected the 1.5Tb hard drive, starting with the SATA data cable, then the power cord, and then hooked up the 1Tb drive. Hearing the old hard drive coming back to life was kind of thrilling. Everything went well, no cable got stuck in my CPU or rear fans, and the PC didn’t freeze like it would have with IDE. The hot swap worked.

After that, things were relatively straightforward. As with my main PC, I used GParted to transfer my Linux boot partition and reconstruct the layout. Fortunately, I remembered to reset the partition table beforehand. If I hadn’t done that, the GPT that was on my SSD would have caused booting issues that would have driven me mad! I would probably have ended up reinstalling everything, angry at Ubuntu, technology and probably the whole of humankind. A single step, recreating the msdos partition table from GParted before the transfer, saved me from that!
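
For reference, the same partition table reset can be done from a terminal with parted, assuming the SSD shows up as /dev/sdg as in the session further below (this wipes the existing partition table, so the device name had better be the right one):

sudo parted /dev/sdg mklabel msdos   # answer the confirmation prompt; this destroys the old table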

The following picture shows my LCD, on which we can see the progress of the transfer.

DSC03935

See how bulky this setup was: HTPC on the floor, case open, SSD hanging on top. Fortunately, it was possible to make this setup clean once again after all this.

DSC03936

The home partition: too big to fit on the SSD

Unfortunately, GParted didn’t want to transfer my home partition to the SSD, because it was obviously too large. I could have shrunk it in order to copy it, but I wanted to avoid altering the hard drive in case something bad happened. I thus instructed GParted to simply create a blank Ext4 partition and used cp to perform the copy. The following terminal session shows how I managed to do it in such a way that all file metadata (timestamps, ownership, permissions) was preserved.

ubuntu@ubuntu:~$ mkdir /media/old-home
mkdir: cannot create directory ‘/media/old-home’: Permission denied
ubuntu@ubuntu:~$ sudo mkdir /media/old-home
ubuntu@ubuntu:~$ sudo fdisk -l /dev/sda

Disk /dev/sda: 1000.2 GB, 1000204886016 bytes
255 heads, 63 sectors/track, 121601 cylinders, total 1953525168 sectors
Units = sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x000e2c4d

   Device Boot      Start         End      Blocks   Id  System
/dev/sda1   *          63    40965749    20482843+  83  Linux
/dev/sda2        40965750    81931499    20482875   83  Linux
/dev/sda3        81931500  1953520064   935794282+   5  Extended
/dev/sda5        81931563  1943286659   930677548+  83  Linux
/dev/sda6      1943286723  1953520064     5116671   82  Linux swap / Solaris
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sda5 /media/old-home/
ubuntu@ubuntu:~$ ls /media/old-home/
eric  lost+found  mythtv
ubuntu@ubuntu:~$ sudo fdisk -l /dev/sdg

Disk /dev/sdg: 120.0 GB, 120034123776 bytes
255 heads, 63 sectors/track, 14593 cylinders, total 234441648 sectors
Units = sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x000d9a0c

   Device Boot      Start         End      Blocks   Id  System
/dev/sdg1            2048    40968191    20483072   83  Linux
/dev/sdg2        40968192    81934335    20483072   83  Linux
/dev/sdg3        81934336   234440703    76253184    5  Extended
/dev/sdg5        81936384    92170239     5116928   82  Linux swap / Solaris
/dev/sdg6        92172288   234440703    71134208   83  Linux
ubuntu@ubuntu:~$ sudo mkdir /media/new-home
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sdg6 /media/new-home/
ubuntu@ubuntu:~$ sudo cp -a /media/old-home/* /media/new-home
ubuntu@ubuntu:~$ ls -a /media/new-home/ -l
total 36
drwxr-xr-x  5 root root  4096 Oct 26 19:49 .
drwxr-xr-x  1 root root   100 Oct 26 19:44 ..
drwxr-xr-x 69 1000 1000 12288 Oct 25 22:52 eric
drwx------  2 root root 16384 Sep 26  2009 lost+found
drwxr-xr-x  3  122  130  4096 Jan 24  2011 mythtv

The main idea is to mount both the old and new partitions, then use cp with the -a option and root access (via sudo) in order to preserve everything. The operation went smoothly.

The boot loader

Even after copying all Ubuntu-related data from my old hard drive, my SSD was still not bootable. To make booting off the SSD possible, I had to install GRUB. Unfortunately, reinstalling GRUB on Ubuntu is not as simple as it should be. If there is a package that does it, why isn’t it built into Ubuntu’s image? Maybe because, for most setups, reinstalling from scratch takes 15 minutes. That’s true, but then what about the tweaks to fix the too-small mouse pointer, make XBMC work with S/PDIF sound, reinstall MakeMKV, etc.? Each step is simple, at least when no unexpected difficulty creeps in, but the sum of small things to tweak makes it long.

So let’s avoid this by running the following!

ubuntu@ubuntu:~$ sudo mkdir /media/ubuntu
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sdg1 /media/ubuntu/
ubuntu@ubuntu:~$ sudo mount --rbind /dev /media/ubuntu/dev
ubuntu@ubuntu:~$ sudo mount --rbind /sys /media/ubuntu/sys
ubuntu@ubuntu:~$ sudo mount --rbind /proc /media/ubuntu/proc
ubuntu@ubuntu:~$ sudo chroot /media/ubuntu
root@ubuntu:/# grub-install /dev/sdg
Installing for i386-pc platform.
Installation finished. No error reported.
root@ubuntu:/# update-grub
Generating grub configuration file ...
Warning: Setting GRUB_TIMEOUT to a non-zero value when GRUB_HIDDEN_TIMEOUT is set is no longer supported.
Found linux image: /boot/vmlinuz-3.13.0-37-generic
Found initrd image: /boot/initrd.img-3.13.0-37-generic
Found linux image: /boot/vmlinuz-3.13.0-36-generic
Found initrd image: /boot/initrd.img-3.13.0-36-generic
Found linux image: /boot/vmlinuz-3.13.0-35-generic
Found initrd image: /boot/initrd.img-3.13.0-35-generic
Found linux image: /boot/vmlinuz-3.2.0-61-generic
Found initrd image: /boot/initrd.img-3.2.0-61-generic
Found linux image: /boot/vmlinuz-3.0.0-17-generic
Found initrd image: /boot/initrd.img-3.0.0-17-generic
Found linux image: /boot/vmlinuz-2.6.38-12-generic
Found initrd image: /boot/initrd.img-2.6.38-12-generic
Found linux image: /boot/vmlinuz-2.6.32-25-generic
Found initrd image: /boot/initrd.img-2.6.32-25-generic
Found linux image: /boot/vmlinuz-2.6.31-21-generic
Found initrd image: /boot/initrd.img-2.6.31-21-generic
Found linux image: /boot/vmlinuz-2.6.28-16-generic
Found initrd image: /boot/initrd.img-2.6.28-16-generic
Found memtest86+ image: /boot/memtest86+.elf
Found memtest86+ image: /boot/memtest86+.bin
Found Ubuntu 14.04.1 LTS (14.04) on /dev/sda1
done

The main idea here is to create a chroot environment similar to my regular Ubuntu setup, then install GRUB from there. I ran update-grub to make sure any disk identifiers would be updated to point to the SSD rather than the old hard drive. Unfortunately, a small glitch happened: update-grub detected the Ubuntu setup on my old hard drive. To get rid of it, I had to unmount the old hard drive and unplug it! After rerunning update-grub, I got the correct configuration.

Updating mount points

Since I rebuilt the home partition rather than copying it, its UUID changed, so I had to update the mount point in /etc/fstab. I thus ran the following:

root@ubuntu:/# cd /dev/disk/by-uuid/
root@ubuntu:/dev/disk/by-uuid# ls /dev/disk/by-uuid/ -l | grep sdg6
lrwxrwxrwx 1 root root 10 Oct 26 15:37 fb543fcb-908a-463d-bc1f-896f1892e3ad -> ../../sdg6
root@ubuntu:/dev/disk/by-uuid# ls /dev/disk/by-uuid/ -l | grep sdg1
lrwxrwxrwx 1 root root 10 Oct 26 16:11 54f4cbd6-aed0-4b43-91c0-2f8d866f3ee3 -> ../../sdg1

After I figured out the UUIDs, I had to open up /media/ubuntu/etc/fstab with gedit and make sure the mount points were correct. I only had to update the UUID for the /home partition.
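
For the record, the /home line in fstab ended up looking roughly like the following; the mount options are just the stock ones Ubuntu puts there, only the UUID actually changed:

UUID=fb543fcb-908a-463d-bc1f-896f1892e3ad /home           ext4    defaults        0       2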

Test boot

After all these preparatory steps, it was time for a test! I thus powered off the computer and made sure my old 1Tb hard drive was unplugged and my new SSD was hooked up. I turned on the PC and waited for the forever-lasting BIOS POST. Why does it take so long to boot a desktop while a laptop almost instantly hands control off to the OS? After the BIOS handed control off to the OS, I got a blank screen with a blinking cursor, nothing else. I tried a second time: same result.

So after all these efforts, do I really have to format and reinstall from scratch? It seemed so. Before doing that, I rebooted my machine once again and entered the BIOS setup by hitting the DEL key. Once there, I looked at the hard drives and found that the SSD was hooked up but not on SATA port 0.

I turned off the machine and connected the drive to a different port, the one that seemed to be the first. Looking into the BIOS setup again, my SSD was now on port 0. Ok, let’s try that one last time!

After a blank screen lasting too many seconds for an SSD boot, making me fear a frustrating reinstall, the Ubuntu logo appeared, and my desktop finally came up! A quick check confirmed that all the hard drives were present, except of course the disconnected 1Tb one. The SSD was ready to be installed into the machine!

The hardware part

The downside of SSDs is that they don’t seem to fit in any regular desktop case, only in laptops! This is a very frustrating limitation. Why are these drives all 2.5″, or why don’t cases have 2.5″ bays? When I shopped for my computer case, only high-end ones had 2.5″ bays, and those came with fancy mechanisms to make the drives removable without plugging in any cables, yet another factor of potential problems. Maybe at the time I am writing this post, some cases with SSD bays are available, but that doesn’t matter; I won’t change my case unless I really need to!

Before installing the SSD, I first removed that old 1Tb drive. I just had to remove four screws from my drive cage and slide the drive out.

DSC03938

DSC03939

To help me install my SSD into my HTPC case, I had a bunch of screws as well as an OCZ bay bracket. Just screwing the drive into the adapter’s tray took me forever: I had trouble finding screws that fit, and there was already a screw in one of the SSD’s holes, I don’t know exactly why, which took me almost five minutes to realize. I then had trouble aligning the screws with the holes, was getting more and more tired and prone to dropping screws, etc. At least the screw I dropped fell on my table, so I didn’t have to search for it on the floor forever.

The following picture shows the drive in the bracket.

DSC03937

Then I had to screw that assembly into the drive cage of my case. Unfortunately, the upper bays of the cage only offer bottom holes, while the SSD adapter has only side screw holes! I thus had to screw the adapter into one of the bottom bays, which are clearly meant for hard drives, with their rubber pads to absorb vibration. None of my screws fit well. It seems that the SSD adapter has smaller holes than normal while the screws for the drive bay are larger than usual! I got it done after more than 15 minutes of attempts. I thought many times I would have to postpone this job and wait for my father to come by with a drill to make some new holes in the SSD bracket or the case.

The following picture shows the drive in the cage.

DSC03941

I don’t know exactly how much time I spent on this installation, but by the end, I was tired and asking myself whether all this would be worth it.

Well, after the SSD was screwed in and the drive cage was back in my HTPC case, I realized I wouldn’t be able to hook up my four SATA drives! No matter what I tried, there was always one drive lacking power. This was because the SATA cables coming out of my power supply unit were too short to accommodate the drive layout I had come up with! Ok, I’m at a dead end now.

Before giving up and bringing that beast to a computer store in the hope they would figure out a way to hook the four drives up (maybe with some extension cable I don’t have, or using a new PSU), I remembered that the 1Tb drive I removed had been in the middle upper bay, which was now empty. My only hope of getting all the drives powered that day was thus to move one of my hard drives there. Ok, so let’s remove the cage again and play with the screwdriver once more!

I moved my 3Tb drive from the side bay to the upper one and put the drive cage back into my case. I was then able to hook up power. Reaching the SSD in the side bay to hook up the SATA cables was a bit tricky, but I finally got it. A last check confirmed that all my drives were hooked up, except my Blu-ray writer. Ok, just one more cable to plug in, and that was it!

Was this all worth it?

After all this hassle, I asked myself this question. When I booted up the machine, it seemed as slow as with the hard drive. What? Maybe the CPU is too slow, after all. But when I reached the desktop and started XBMC, I felt the system was more responsive.

More importantly, the machine became a lot quieter. For a few weeks, this HTPC had been making a lot of noise. I thought it was the CPU fan, stressed out by the Minecraft server running on the system, but the 1Tb hard drive was contributing to the noise as well. I suspect it was emitting more and more heat, causing the temperature to rise inside the case and heating up my poor little CPU. The CPU fan then reacted by spinning like crazy.

Even after I restarted my Minecraft server, the noise didn’t come back. I am still surprised by this effect, which I didn’t expect.

This 1Tb hard drive is definitely getting old and has emitted some suspicious sounds a few times. I am wondering if it would have failed and died had I left it in the machine. This SSD move thus saved me an unexpected reinstall and will help me have a better time with this HTPC.

So yes, after all, it was worth it!

Categories
Configuration

One SSD instead of two: simpler or not?

My Core i7 machine, named Drake, had two 120Gb SSD drives. I purchased the first one with the machine and put Windows 7 and Ubuntu on it. Then I needed more space to get Mac OS X, so I added a second 120Gb SSD. Mac OS X became a pain, almost unusable because everything was too small. When I reached the point where I had to lower the screen resolution to get Thunderbird running comfortably, I got rid of Mac OS X. Then Windows 7, upgraded to Windows 8, started to eat up more space, so I needed to move Ubuntu to the second SSD.

I ended up with a brittle configuration composed of the ESP (EFI system partition) on the second SSD, Windows 8.1 on the first drive and Ubuntu on the second. I was waiting for a special deal on a 240Gb SSD and finally got one from TigerDirect at the beginning of September 2014. However, purchasing the SSD is only the easy part. Migrating data from two SSDs to a single one, with Windows 8.1, Ubuntu 14.04 and UEFI in the way, is an incredible source of headaches. This page shows how I got it done.

The easy way: reinstall everything

That would have worked ten, maybe even five years ago. Yes, just reinstall Windows, a few drivers, a few programs, put back Ubuntu, adjust some settings, fine-tune a bit, and enjoy the rebirth of the system, coming back to life and full functionality. Things have changed over the years, and not for the better. Now that Microsoft and other hardware manufacturers assume people won’t install things themselves and will rather purchase hardware with everything preinstalled and preconfigured, things have become more and more time-consuming to set up. Just installing Windows 8 takes more than 45 minutes, and although I could obtain a DVD with Windows 8.1, my Windows 8 serial number won’t work with it. I would have had to install Windows 8, then upgrade to Windows 8.1 again!

Then come the drivers. Since I purchased my motherboard before Windows 8 was released, all my motherboard CD has to offer is Windows 7 drivers. So I cannot use the easy auto-install tool that performs an unattended setup. Instead, I have to download every driver separately from Asus, run them, wait, reboot, run the next one, etc. Then there is the NVIDIA driver, requiring a 100 Mb download and yet another installation taking more than five minutes, and yet another reboot. Maybe I chose the wrong motherboard. By sacrificing a few USB ports, S/PDIF audio and maybe some PCI Express slots, I could perhaps get something simpler requiring fewer drivers, something able to make use of what is prepackaged within Windows. That’s still to be investigated.

Then come the programs. Yes, Ninite can install many programs for me automatically, but not GNU Emacs or GNU GPG, and it won’t configure my Dropbox, resync my Firefox bookmarks or reinitialize my Thunderbird email settings. It won’t link my Documents, Images, Music and Videos default folders back to my data hard drive.

And then come the licenses. How will Windows 8.1 activation behave? Will it happen smoothly, or will Windows decide that this change of SSD is too much and require me to call Microsoft to perform activation by phone, forcing me to exchange, by voice, over a poor channel, tens of nonsensical digits? Beyond Windows 8.1 activation, my DAW, Live from Ableton, also requires authorization. I’m not sure it will reauthorize, since I have activated it on my main PC as well as my ultrabook. That means additional hassle.

Bottom line, reinstalling is a pain, and that is just the Windows side. Ubuntu installation is usually smooth, but when a single thing goes bad, it requires hours of Google searches.

This is why I wanted a better way. I was so tired of this tedious process that I was considering giving up on this machine and using my ultrabook instead, if the data transfer failed. But my ultrabook, with its 128Gb SSD, doesn’t have enough storage for editing music made of samples or recording/editing Minecraft videos.

Preliminary connection of the new SSD

Before installing the new 240Gb SSD into my system permanently, I wanted to be sure I would be able to transfer my two operating systems (Windows 8.1 and Ubuntu 14.04) and make them boot. I thus only plugged in the disk rather than attaching it inside my case right away. Fortunately, I had some free SATA power cables as well as an extra SATA cable and port. That allowed me to connect the new drive without disconnecting the others. This way, it would have been easy to roll back in case of difficulties forcing me to reinstall everything, and then think about another strategy or gather my courage and patience for the full reinstall.

I then booted from a USB stick with a live installation of Ubuntu 14.04. This was necessary to perform the data transfer on a totally offline, clean file system.

Before transferring anything to the drive, I ran a SMART self-test. For this, I installed smartmontools with apt-get and ran sudo smartctl -t long /dev/sdb (at that time, /dev/sdb was the device of the new drive). That took almost an hour, but I could leave it running and do something else.
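
Once the long test completes, the verdict can be checked with smartctl as well, with something like the following (-H gives the overall health assessment, -l selftest lists the completed self-tests):

sudo smartctl -H /dev/sdb            # overall health assessment
sudo smartctl -l selftest /dev/sdb   # log of completed self-tests and their results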

The self-test found no defects. I learned to do this preliminary step the hard way when I assembled a machine for my parents. The hard drive failed shortly after, while I was configuring Windows, and I had to RMA it. Performing a self-test beforehand would have saved me a waste of time and some frustration.

The drive being free of any defect, at least from the point of view of the self-test, I moved to the next step: data transfer.

GParted is the king!

A long time ago, my only friend for partitioning and drive transfers was Partition Magic, from PowerQuest, now owned by Symantec. That time is over, thanks to GParted, a free open source tool that comes with Ubuntu. But this time, my job was pushing GParted to its limits. Here are the operations I needed to perform with it:

  1. Create a GUID Partition Table (GPT) on the new SSD. This is because I want a pure UEFI-based system. But this is not strictly necessary since the drive is far from the 2Tb limit!
  2. Copy the first partition of my second SSD at the beginning of the new drive: this is the ESP.
  3. Copy the first partition of the first SSD: this is the 128Mb system reserved partition of Windows. That copy wasn’t possible, because GParted didn’t know the partition type. I thus left a 128Mb hole declared as Unformatted, to figure out a way out later on. I was hoping Windows could recreate the data on this partition.
  4. Copy the second partition of the first SSD: this was the Windows main partition.
  5. Copy the 40-ish Gb partition of my second SSD at the end of the new drive: this was my home drive from Ubuntu.
  6. Copy the 20-ish Gb partition of my second SSD at the bottom of the free space on the new drive: this was my main Ubuntu installation.
  7. Create an extra 20 Gb partition on the new drive in case I would like to give a shot to a new Linux distribution.
  8. Create a 16Gb swap space on the new drive for Ubuntu’s use.
  9. Resize my Windows main partition to take the rest of the space.

Phew! This long sequence, gathering pieces from different sources, reminds me of infusion crafting in the Thaumcraft mod for Minecraft, where essentia and items are combined on an altar to craft powerful magical objects.

I hoped that sequence would work, but it failed at step 5. For no obvious reason, GParted wasn't able to copy my Ubuntu home partition to the very end of the new SSD! I had to leave an 8MB gap and then resize the partition to fill it. I then performed the other operations one by one. That was quite a tedious job, because the mouse pointer was too small and impossible to enlarge without a system hack (an Ubuntu bug since 11.10: they chose to remove the option to resize the mouse pointer rather than fix the issue), and sometimes a click would open a menu and close it right away instead of leaving it open.

The following image shows the final layout. Isn't that great? I am not sure at all this is simpler with one drive than with two, after all…

gparted

After this transfer process, I tried to recreate the entries for Windows and Ubuntu in my UEFI's NVRAM, using efibootmgr. I then unplugged the SATA cables of my two 120GB SSD drives from the motherboard and rebooted the PC. I won't state the exact commands I used here, because they just failed: the system wasn't booting at all.
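
I don't remember those exact commands, and since they failed I won't try to reconstruct them precisely. For reference only, creating an entry with efibootmgr generally takes the following shape; the disk, partition number, label and loader path here are placeholders, not necessarily what I typed:

sudo efibootmgr -c -d /dev/sda -p 1 -L "ubuntu" -l '\EFI\ubuntu\grubx64.efi'
sudo efibootmgr -v    # list the NVRAM entries to verify the result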

Fixing Ubuntu

Back to my Ubuntu live USB, after at least five attempts, because my motherboard is apparently defective and misses the F8 key press from time to time, which forced me to jump into Setup and change the boot order from there to boot the UEFI USB stick. Boot time with that Asus board is desperately long: waiting 15 to 20 seconds from power-up to boot loader is a shame when it takes less than 1 second on a $300 laptop! But the laptop lacks the storage expandability I need, so I am always stuck on one end or the other.

Then comes the fun part. I am pretty surprised there is no easier way to restore GRUB than the following. I read about boot-repair, but it is simply not on the live image; probably yet another PPA to copy/paste and install. Anyway, I ended up getting it to work.

First I found the partition where Ubuntu was installed, /dev/sda5, and mounted it: sudo mkdir /media/ubuntu && sudo mount -t ext4 /dev/sda5 /media/ubuntu. I did the same with my ESP: sudo mkdir /media/efi && sudo mount -t vfat /dev/sda1 /media/efi.

The second step was to establish bind mounts:

sudo mount --rbind /dev /media/ubuntu/dev
sudo mount --rbind /proc /media/ubuntu/proc
sudo mount --rbind /sys /media/ubuntu/sys
sudo mount --rbind /media/efi /media/ubuntu/boot/efi

That made those directories inside my Ubuntu mount mirror the live system's top-level directories exactly.

Then I had to chroot into my Ubuntu, using

sudo chroot /media/ubuntu

After all this, the system was behaving almost as if I had started a shell on my installed Ubuntu setup. From there, I tried

sudo update-grub2

That just updated GRUB's menu entries, not the EFI side, so it didn't fix the boot.

Then I tried

sudo grub-install

If I remember correctly, no arguments were necessary, and that fixed my GRUB EFI installation and added the Ubuntu entry back to the NVRAM. This worked only once /boot/efi was correctly referring to my ESP. Note however that for this to work fully, the live Ubuntu USB had to be booted in UEFI mode, not in the default MBR/legacy mode.
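
In case the defaults are ever not enough, a fully explicit invocation would look something like the line below; the target and bootloader id are assumptions for a typical 64-bit Ubuntu UEFI setup, not something I had to type:

sudo grub-install --target=x86_64-efi --efi-directory=/boot/efi --bootloader-id=ubuntu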

A reboot later, I was starting my Ubuntu setup, fully intact and working! Half of the transfer done! Not quite…

Windows was failing to boot and Ubuntu’s update-grub wasn’t detecting Windows anymore. Quite bad.

Windows: desperately dead

Windows, on the other hand, wasn't booting at all. It was showing a blue screen suggesting I use the repair tools from the Windows DVD. The last time I did this, the tools ran for at least a minute and bailed out, so I had to do a complete refresh, which ended up wiping everything and leaving only applications from the Windows Store. If I have to choose between such a messed-up repair and a clean install, I would go for the second option.

Before entering this reinstall nightmare once again, I tried to recover the reserved partition. For this, I plugged my Windows 120GB SSD back in and booted from my live USB stick, to make sure Windows would not kick in and see two copies of itself (one on the old SSD, one on the new). If Windows sees two copies of itself, it changes the disk ID of one copy. If the new drive is the one changed, everything is messed up and Windows cannot boot anymore until a refresh is done (and then everything is messed up again!). Back in my live USB session, I used dd to transfer the bytes of the old reserved partition to the new one. I also made sure the new /dev/sda2 reserved partition was marked as such in GParted, by modifying the flags. That changed nothing.
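
The dd copy itself was a single command. It looked roughly like the line below, assuming the old Windows SSD came up as /dev/sdb on that boot; both partitions are the same 128MB size, so a plain byte-for-byte copy is enough:

sudo dd if=/dev/sdb1 of=/dev/sda2 bs=1M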

The post How to repair the EFI Bootloader in Windows 8 literally saved me hours of work! It gives a procedure to fix the boot loader. The main idea is to open a console from the Windows DVD and run the bootrec /fixboot command from the EFI\Microsoft\Boot\ directory of the ESP, followed by bcdboot with a couple of arguments, again from the ESP. Luckily, I had my ultrabook, which was quite handy for checking the page while I was running the commands on my primary PC.
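
From memory, the commands are along these lines, run from the recovery console; the volume number and drive letter depend on what diskpart shows on the machine, so treat them as placeholders:

diskpart
list vol
select vol 2          (whichever volume is the FAT32 ESP)
assign letter=b:
exit
cd /d b:\EFI\Microsoft\Boot\
bootrec /fixboot
bcdboot c:\Windows /s b: /f UEFI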

That solved the issue and allowed me to boot into Windows 8.1! PHEW! Quite a nice step forward.

GRUB not detecting Windows

Now that my machine was able to boot into both Windows and Linux, one could wonder what was missing. Well, I had no easy way to choose which operating system to boot at startup. Originally, GRUB was offering me an option to boot into Windows or Ubuntu. After the transfer, it was only seeing Ubuntu.

I found procedures to manually add an entry for Windows, but that involved finding and copy/pasting a drive UUID and probably redoing the change on each kernel update. I didn't want that. Another possibility was to install an alternative EFI boot loader like rEFInd, but these have a tendency to display many unwanted icons that do nothing. I had enough trouble with this while fiddling with a triple boot (Windows, Linux, Mac OS X).
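
For the record, the manual route would have meant dropping something like this into /etc/grub.d/40_custom and rerunning update-grub, with the XXXX-XXXX placeholder replaced by the ESP's UUID (as reported by blkid); exactly the kind of hand maintenance I wanted to avoid:

menuentry "Windows Boot Manager (manual)" {
    insmod part_gpt
    insmod fat
    insmod chain
    search --fs-uuid --set=root XXXX-XXXX
    chainloader /EFI/Microsoft/Boot/bootmgfw.efi
}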

There seemed to be absolutely no way out: people either added Windows manually or it worked out of the box for them. I had to spend more than 45 minutes inspecting the os-prober script and walking through it! By looking at the script and its logs in /var/log/syslog, I managed to find out it was skipping my ESP because the partition was not flagged as Boot! I fixed that from GParted, reran sudo update-grub and tada! GRUB was seeing Windows!
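
The same flag can be set from the command line too; on a GPT disk, parted's boot flag is what marks a partition as an ESP, so the fix boils down to something like:

sudo parted /dev/sda set 1 boot on
sudo update-grub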

This is NOT the end!

Then I had to proceed with the hardware installation of the new drive. Since I was too impatient to get an SSD, I ended up with an ill-designed system: if I had waited another year before purchasing my Core i7 PC, I would have gotten a superb case with support for SSD drives. Instead I have a CoolerMaster case with only standard 3.5″ drive bays and have to fiddle with SSD brackets. Screwing the SSD into the bracket is a painful process of trial and error, and then the assembly doesn't fit well with the screwless mechanism of the case. It somewhat holds in place, but it is not a smooth installation like a regular 3.5″ drive.

Some more fiddling later, my new SSD was plugged back into my PSU and motherboard, and I got rid of the two extra SATA cables. I stored them away; they will be useful sooner rather than later, because my two 120GB SSDs won't remain unused.

I plan to put one of them into my HTPC, which will be another adventure of its own. My HTPC has only four SATA ports, all used up, so I will have to get rid of one hard drive.