The Leopard dead end

Finally got it: I trapped myself into a dead end. This afternoon, I wanted to get Adblock Plus back into my own account’s Google Chrome on the Mac. I found out that extensions wouldn’t install unless I updated Google Chrome. I tried, but that stupidly installed a version of Chrome incompatible with Mac OS X 10.5, and I just cannot revert to the old Chrome. It cannot be downloaded anymore from Google, or apparently from anywhere else. So I ended up with only Safari as a browser on this machine; I cannot get Chrome back, no matter how hard I try. However, one hour ago, after I had put away this damned machine, I found out I was able to get Firefox 16, the last Firefox usable on Mac OS X 10.5. This wouldn’t give me access to Sync for my bookmarks and history (it now requires Firefox 33 or something), but at least I would get a browser I’m used to and that supports HTML5.

Well, at worst, I can live with Safari as the only browser, but being stuck on Mac OS X 10.5.8 is more and more of a handicap. It prevents me from getting Firefox with Sync, Chrome, SmoothMouse or Razer Synapse; I don’t have access to the Mac App Store, so I cannot get back the screen capture utility I started using on my Hackintosh; and I cannot obtain Xcode, which I would need to get MEncoder through MacPorts. MEncoder would be useful to encode videos into the QuickTime variant Final Cut Express prefers for optimum performance, once I figure out what that variant is, and assuming there is such an optimal format that would avoid lengthy rendering, or at least shorten it.

The golden upgrade path: from scratch to Yosemite!

I was imagining an upgrade path capable of leading me to Yosemite. I was over-enthusiastic about this, thrilled by the possibility that it might be doable. The idea was the following:

  1. Back up as much of the hard drive’s data as I can.
  2. Optionally replace the hard drive with a 120 GB SSD I don’t use anymore.
  3. Install, from scratch, the Mountain Lion I got during my Hackintosh venture. The USB key UniBeast produced for me is just a wrapped-up pristine Mac OS X installer that can boot on MBR-based PCs. The Mac’s EFI will just ignore that part, find the EFI loader from Apple and boot that, bypassing the hacked part of the media. I was thus hopeful to get a fresh Mountain Lion system installed from my hacked USB stick! After all, I paid for Mountain Lion, so it would be nice to get a return on that investment.
  4. Upgrade to Mavericks through the Mac App Store, maybe even directly to Yosemite.
  5. Update to Yosemite through the Mac App Store if I had to go through Mavericks.
  6. Find a way to get back Final Cut Express; maybe I will be able to copy it back from the backed-up hard drive, maybe not.
  7. Enjoy!

Unfortunately, things didn’t go so great. First, installing an SSD would be a great undertaking, requiring me to almost tear the whole laptop apart. Then I found out, by running uname -a from a terminal, that Apple made the same stupid mistake as all other manufacturers until Windows 7 came out: shipping a 32-bit OS on machines with a CPU capable of 64-bit operation!

But Apple went one step further: hard-coding the EFI to boot only 32-bit loaders!!! There is apparently no workaround. On a PC with a 32-bit EFI, you could fall back to legacy boot, which lets the OS’s own x86 code switch the CPU into 64-bit mode. But a Mac’s EFI can only boot EFI executables, and EFI executables are incapable of switching the CPU mode, at least as far as I know. I didn’t get confirmation of this from the EFI spec, so I may be surprised later on. So if the EFI on a Mac is 32-bit, that Mac can only run 32-bit OSes!

Why is this so bad? Well, Mountain Lion and later versions only ship a 64-bit kernel, so they won’t run behind a 32-bit EFI, no chance of it. So not only am I stuck at Mac OS X 10.7, I don’t even have the media needed to jump there.
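Both facts can be checked from a Terminal: uname -a shows the kernel architecture, and ioreg reports the firmware ABI. The little helper below is just my sketch for interpreting the ioreg line; the sample value fed to it mimics what a 32-bit-EFI Mac reports:

```shell
# On the Mac itself one would run:
#   uname -a
#   ioreg -l -p IODeviceTree | grep firmware-abi
# This helper just classifies the reported firmware ABI string.
efi_bits() {
  case "$1" in
    *EFI64*) echo "64-bit EFI: can boot Mountain Lion's 64-bit-only kernel" ;;
    *EFI32*) echo "32-bit EFI: 32-bit loaders only, so Lion (10.7) at most" ;;
    *)       echo "unknown firmware ABI" ;;
  esac
}
# Sample ioreg line from a 32-bit-EFI Mac:
efi_bits '    "firmware-abi" = <"EFI32">'
```

On this MacBook Pro, that check would land in the EFI32 branch, matching the forbidden sign I got from the Mountain Lion stick.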

This afternoon, I confirmed this theory by attempting to boot the Mac off my hacked Mountain Lion USB stick. Although the EFI displayed the loader from the USB stick, selecting it simply resulted in a forbidden sign, nothing more.

From Leopard to Lion

Without access to the Mac App Store, getting my hands on Lion will be as hard as getting Mountain Lion. I would have to install a hacked version of Mac OS X into VirtualBox or rebuild my Hackintosh, use the Mac App Store to download Lion (assuming it would let me do so without meeting the requirements) and transfer that onto a USB stick or DVD.

Every Google search leads me to the need to purchase the retail Snow Leopard DVD from the Apple Store. That would require ordering the disc online, waiting for it forever, and it would be delivered while I am at work (unless I spend the whole week working from home, and maybe even then). Ideally, I would purchase a download link to an ISO image and burn it.

However, things are not that simple. This afternoon, pissed off beyond imagination, I tried searching for a torrent that would let me grab that damned ISO. I found several, but discovered that the ISO file was 7 GB, so it doesn’t fit on a DVD. So even if I could download the ISO legally from Apple, I wouldn’t be able to burn it to a DVD. Somebody managed to unpack the ISO onto an iPod Classic and installed Mac OS X from that through a FireWire port. Wow! I don’t have such artillery, neither the iPod Classic nor the FireWire cable. Would I get some luck with a USB stick or external hard drive? Maybe, maybe not.
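The arithmetic is easy to check with a throwaway sketch; the numbers below are the nominal disc capacities and the roughly 7 GB size reported for the image, not exact figures:

```shell
ISO=7000000000        # the ~7 GB Snow Leopard image found on the trackers
DVD_SL=4700000000     # single-layer DVD±R, about 4.7 GB
DVD_DL=8500000000     # dual-layer DVD±R, about 8.5 GB
[ "$ISO" -gt "$DVD_SL" ] && echo "too big for single-layer"
[ "$ISO" -le "$DVD_DL" ] && echo "fits on dual-layer"
```

So a dual-layer blank would in principle hold it, if the burner in a 2006 machine can write dual-layer at all.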

I kind of lost hope at this point. It seems a Mac either works out of the box as-is, or requires divine intervention from an Apple technician. But getting an Apple technician to work on the machine is another headache, requiring a scheduled appointment, probably during the daytime, so I would have to interrupt my work day to get that damned computer checked. And the tech, upon seeing the empty battery slot, would probably stop there and simply recommend I purchase a brand-new MacBook Pro, which would come not with Snow Leopard but with the pristine, all-new, super-great, super-cool Yosemite. Yeah! But that would cost me more than $2000, and still no Final Cut Express.

I don’t know what to do at this point. I’m oscillating between switching gears and reinstalling my Hackintosh, ordering this damned Snow Leopard DVD and seeing what happens, or getting it (illegally, unfortunately) through BitTorrent and fiddling with external-media workarounds to accommodate the oversized image. If I can get my hands on a couple of dual-layer DVD±R discs, maybe my burner would be able to write the large ISO onto one of those, maybe not; I never tried.

I finally decided to try ordering the retail DVD. I tried upgrading to expedited shipping to make sure it arrives on a day I know I will be at home. I also found out that upgrading to Lion would require bumping the memory up to 2 GB, so I will have to figure out how to replace the SO-DIMM modules before going from Snow Leopard to Lion.

Note that I also found out that the Mac has a 120 GB hard drive. As a result, if I later switch to my 120 GB OCZ Agility 3 SSD, I will have the same amount of disk space, with more speed.

Can this Mac be saved?

If I cannot get a better-supported Mac OS X, can this machine be freed from the limiting plague that Mac OS X seems to be? The best thing I could install on it is probably some version of Ubuntu. But searches about this are not so positive. Some people succeeded, but they used a hacked Ubuntu version. Some say it is possible to install a 32-bit version of Ubuntu 14.04 from a DVD or USB stick (the EFI could boot the live CD); others had no success unless they used a hacked 12.04 DVD and then jumped to 14.04. So this promises to be a very frustrating weekend (and maybe even a whole Christmas holiday) of trial and error! I’m not sure I want to venture into that.


External display issues

Quickly enough, I wanted to plug the Mac into an external display. First, I would get a bigger screen. Second, I would be able to use a fully-functional keyboard with working F1-F10, Alt/Option and Escape keys in front of my external monitor, rather than constantly leaning my head towards the far-away laptop while controlling it with an external keyboard and mouse.

Physical connection

I quickly found out the good news: that October 2006 MacBook Pro (I discovered the exact model using MacTracker this morning) has a real, full-sized DVI output port! Not the pesky mini-DVI requiring almost-Apple-specific adapters, nor even the newer Mini DisplayPort. No, a real DVI port! OK, I have a DVI to HDMI adapter for that!


Well, I realized with surprise that the DVI end of this adapter didn’t fit into the DVI port of the Mac. The exact reason is still to be determined. However, I tried with a DVI to HDMI cable I had, and its DVI end fit. The DVI end of a DVI to DVI cable also fit.

I was thus able to hook up the MacBook Pro to my Dell 23″ touch screen through DVI to HDMI. But the touch interface, provided by a separate USB connection, didn’t work; only the keyboard and mouse worked through the USB hub built into the display. I also successfully hooked up the Mac to my old 22″ LG LCD. Ideally, I would have put the machine and the old LCD on a dedicated desk or table, with its own keyboard and mouse (ideally an Apple keyboard, if I can get my hands on one), but I don’t have a large enough table for this to fit comfortably.

The problem with my new Dell LCD is that connecting the cables is hard; the space behind it is too tight. I thus try to leave cables connected at the monitor’s end and just plug and unplug the device ends. There is one loose HDMI cable hanging on my desk. I can plug in anything with an HDMI output directly, or use an adapter to turn that HDMI into Mini DisplayPort (I actually purchased this adapter for a Dell ultrabook provided by Nuance, the company where I work), DVI, mini-HDMI or micro-HDMI.

The fact that the DVI to HDMI adapter didn’t fit was kind of problematic for my monitor setup. I thought about several solutions:

  • An HDMI male to female cable. Most cables are male to male; one with a female end is a bit uncommon.
  • A female to female HDMI coupler that would allow me to link two cables with male HDMI ends together.
  • An HDMI to DisplayPort cable. My Dell LCD has a DisplayPort input which is unused. I could use it instead of the HDMI port for connecting HDMI and Mini DisplayPort laptops. But I may have to purchase an additional Mini DisplayPort to DisplayPort adapter, and I’m not sure the Mini DisplayPort/HDMI/DisplayPort chain would work.
  • A DVI to DisplayPort cable. It may work, but I would be trying my luck with another DVI end that may not fit into the Mac’s port, unless of course I purchase the damned cable at twice the price from an Apple store!
  • A new LCD with easier to access connectors. I may have to sacrifice the touch ability, which is totally unacceptable!

Yesterday afternoon, I tried to get the HDMI male to female cable from a local computer store. I thought I had it, but looking closer, I found out it was a somewhat weird HDMI to VGA cable. The ends are shown in the picture below.


I was quite depressed and exasperated when I noticed that. How will I get my hands on the needed cable if I cannot rely on anybody to help me find it? I will have to wait forever for somebody sighted and versed in computer science to come with me and check, or try ordering the thing online.

But wait. I do have what I need!!! On my way to the gym yesterday, I remembered the HDMI switch I was using with my old LCD monitor to multiply its digital inputs (it has just one DVI and one VGA port). This is exactly the female to female HDMI coupler I needed, just with extra HDMI inputs I don’t need for this particular uncommon use case. Back home yesterday, I made the connection and it worked!


Lack of flexibility

Quickly enough, I found out that Mac OS X would not send 1920×1080 to my LCD. It was mirroring the internal display’s resolution. I went into the Displays preferences and bumped up the resolution, but Mac OS X stubbornly added black vertical bars, truncating the image. It looked similar to the picture below.


I searched and found no solution except closing the lid. Argh! That would prevent me from accessing the internal keyboard if I need to change the sound volume, and the machine might get hot. OK, let’s do it.

But that didn’t help! The laptop, instead of shutting off its internal display and sending video only to the external one, went into suspend mode! And it was hooked up to AC power. It needs to be, because the battery exploded a while ago (before my brother’s girlfriend gave me the machine), so there is no battery at all!


The first time I did this, after almost thirty seconds, my Dell LCD turned back on and I had 1080p.


However, fonts were so tiny that I almost went mad! Moreover, after that, the internal display wouldn’t turn on again.

I don’t know exactly why, but I had to reboot the Mac later on, which reverted it to the two-screen mode. After that, closing the lid just suspended the Mac, without switching to the external display anymore. However, it was a bit easier to work with the two displays, because the truncated display gave somewhat larger fonts than the full 1080p display.

At some point, I hit a dead end with font size and finally lowered the resolution, but that didn’t help much. It just truncated the menu bar, and things were not really bigger.

How about the “extended” display mode? Hitting F7 toggles between mirrored and extended modes. In mirrored mode, the two displays show the same thing. In extended mode, they act as one large screen. Extended mode didn’t work well for me, because the menu bar only showed up on the internal laptop LCD, and each time I tried to bring my lost mouse pointer to the topmost corner, it disappeared into the internal display!

The mystery of the closed lid

I think I found the way to get only the external display. After the laptop suspends with the lid closed, I should move the mouse or press a key on the external keyboard, which wakes the machine up and forces it to use just the external display. I read that somewhere during my never-ending searches about other issues.

Getting back to internal display is a matter of disconnecting the external display, closing the lid and reopening it.


Mouse issues

When I started exploring Final Cut Express on the Mac, I quickly got blocked by the mouse driving me totally crazy. With the Mac mouse that came with the machine, it somewhat worked, but the pointer moved very slowly. I had to move the mouse, lift it, move it again, lift it, five or six times, to bring the pointer where I wanted. This quickly became a real pain. The trackpad works a bit better, but just a bit.

I tried my PC Razer mouse in the hope of better results. The mouse worked, as opposed to the no-name pointing device I tried on my Hackintosh last year, but the pointer moved desperately slowly. I tried to tweak the pointer speed (there are buttons for that on my Razer mouse), but that made things worse! The pointer would move slowly as I nudged the mouse, then jump to a somewhat random place on the screen! It was almost impossible for me to use.

Razer Synapse making my synapses mad

In an attempt to solve this, I tried installing the driver from Razer, Synapse 2.0; there is a Mac version. Installation went well, but after the system restarted, I got an error message popping up about RzUpdater having crashed. I had the choice to Ignore, Report or Relaunch. Tired of unstable software, I didn’t hesitate to try the Report option. That sent a (probably useless) report to Apple and the dialog box was dismissed, but it almost instantly reappeared. I tried to dismiss it with Ignore, then with Relaunch, to no avail.

The message was always on top, covering other windows. I had to move it away to the bottom of the screen. It just hung there, impossible to get rid of completely.

Then started a 45-plus-minute Google search yielding almost nothing except frustration. I got referred to an uninstaller program, found it using Spotlight and ran it, but it claimed to be incompatible with my version of Mac OS X. After a few more searches and attempts, I found out that RzUpdater and RzEngine won’t start on my version of Mac OS X, and moreover, the uninstaller won’t start either! So I was stuck with no solution.

I then searched about cleaning up startup applications and found this. It gave me what I needed to “repair” my system. The /Library/StartupItems and /System/Library/StartupItems folders were empty, which is perfectly normal, and the /Library/LaunchDaemons folder didn’t contain any Razer-related stuff. But /Library/LaunchAgents contained two Razer-related files, one about RzUpdater and one about RzEngine. Resisting the temptation to remove these files without any precaution, I made a backup copy, then proceeded with the removal.

I don’t know if or how this can be done from the Finder. I was tired of the GUI and opened a Terminal to do the surgery from the Bash shell, which I know and like far better than a GUI that doesn’t work well with the keyboard. A plain old rm didn’t do the trick: permission denied. But sudo rm worked; I had to provide the login password of the account I was logged into. The damned RzUpdater message came back after I dismissed it for maybe the 50th time, but one reboot later, it was gone for good.
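The whole surgery boils down to a backup-then-remove pair of commands. Here is a sketch rehearsed in a scratch directory so it can be run safely anywhere; the plist file names are my guesses, and on the real system the directory is /Library/LaunchAgents and the rm needs sudo:

```shell
AGENTS=$(mktemp -d)                  # stand-in for /Library/LaunchAgents
touch "$AGENTS/com.razer.rzupdater.plist" \
      "$AGENTS/com.razer.rzengine.plist"
BACKUP=$(mktemp -d)                  # keep a copy before deleting anything
cp "$AGENTS"/com.razer.*.plist "$BACKUP"/
rm "$AGENTS"/com.razer.*.plist       # real system: sudo rm /Library/LaunchAgents/com.razer.*.plist
ls "$BACKUP"                         # both plists are safely backed up
```

Keeping the backup copy means the agents can be restored with a single cp if removing them breaks something.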

SmoothMouse not so smooth

After this miserable failure with Razer Synapse (it almost bloated the Mac forever and drove me nuts!), I did more searches and found a free solution. My new hope: SmoothMouse! Happy to find something promising, I tried to download and install it. Unfortunately, it didn’t start at all: the tool is incompatible with Mac OS X 10.5 and works only with 10.6.8 or above.

I could try SteerMouse, ControllerMate or USB Overdrive, but all three cost $20 to $30 US. Without the Mac App Store (10.6 only, again!), I would have to stick my credit card into yet another place and be charged an undetermined amount of money, since I am unlucky enough to live in Canada, not in the US. I am more and more frustrated by all these artificial complications. Mouse support should be built into the OS, not require a third-party tool purchased separately. Moreover, I feel that no matter which one I pick, I’ll hit a roadblock later on and have to switch. If I were sure that one of the three tools would solve my mouse problem once and for all, no matter what future mouse I stick into the USB port of this apparently doomed MacBook Pro, I would be glad to pay $20 to the person who saved me!

An unexpected improvement

Feeling more and more likely to end up giving this machine back to my brother’s girlfriend, I decided to isolate my connections to personal profiles (Google, Facebook, Apple ID) in a dedicated account, rather than continuing to log in with her account. The process of creating a new account was easy and worked flawlessly. I then logged in with my new account and found that the mouse worked a bit better. While not perfect, it is usable, so this is less of an issue than before. Moreover, the new account let me start fresh with a cleaner, less cluttered dock.

However, I am finding more and more programs that cannot be installed on the obsolete, no-longer-supported Mac OS X 10.5. This is becoming a roadblock, nearly a showstopper. It may end my adventure prematurely, unless I switch gears and go back to my earlier Hackintosh-based installation!


Defective keys and erratic keyboard shortcuts

One of the first things I attempted to do on the Mac was to turn on full zoom and enlarge the mouse pointer. This can be done from System Preferences (Apple menu), under the Universal Access icon. Zoom was set to No, so I changed it to Yes.

Image 5

As a side note, the Mouse & Trackpad tab offers a neat way to enlarge the mouse pointer. This is one of the greatest Mac OS X features.

Image 6

The Seeing tab gives the keyboard shortcuts to zoom in and out, unfortunately in a quite cryptic and hard-to-remember way. Instead of printing key names such as Command, Option, Shift, etc., Apple decided to use icons that make no sense. This is a real pain for me, because I have trouble associating these images with keys. But after some time, I figured out that the key combination to zoom in was Command (the key marked with an apple, just left of the space bar), Option (the key just left of the Command key) and =. However, hitting that key combination did nothing. I tried several times, without success. I also tried Command-Option-8 many times to make sure that zoom was actually enabled.

Starting to suspect the key wasn’t responding physically, I started a Terminal and launched xev, a utility I know from my UNIX/Linux background. xev listens to events and displays the name X receives when each event occurs. This is a way to figure out what is generated when a key is pressed, a mouse button is clicked, etc. Pressing Command, Shift, =, and some other keys had an effect. However, pressing Option did… nothing.
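xev’s output is verbose, so filtering for the keysym line makes a dead key obvious: a key that prints no line at all is producing nothing. The sample line below mimics a real xev KeyPress report so the filter can be rehearsed without an X server:

```shell
# On the Mac (under X11) one would run:  xev | grep keysym
# and press keys one by one; silence means the key is dead.
sample='    state 0x0, keycode 64 (keysym 0xffe9, Alt_L), same_screen YES'
echo "$sample" | grep -o 'keysym [^)]*'
```

The filter keeps just the keysym and its name, which is all that is needed for this kind of hardware triage.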

OK, maybe some misbehaving application was intercepting the Option keys. To rule this out, I wanted to boot the machine off an Ubuntu USB stick. The way to do that is to insert the stick and press Option at boot, just after hearing the chime. No matter how many times I tried, it had no effect.

I finally tested with an external USB keyboard, but it was a PC keyboard, since I don’t have a spare Mac USB keyboard hanging around the house. This is not ideal, because some keys are misplaced and quite confusing. In particular, to get the Command key on a PC keyboard, I have to press the Windows key; Option is mapped to Alt. With this knowledge in mind, acquired from my Hackintosh experience, I tried Windows-Alt-= and got the zoom! OK, so Mac OS X does process the Option key; the built-in keyboard is simply incapable of producing it.

The only way I can zoom in with just the built-in keyboard is by holding Shift and scrolling with two fingers on the trackpad.

Note that I successfully got the EFI boot menu by turning on the Mac while hooked to an external keyboard and hitting the Alt key after the chime. However, I had to repeat the manipulation five or six times until it succeeded. That reminded me of the memorable trouble I had while trying to jailbreak my iPod after upgrading to iOS 4; entering DFU mode was hard and none of the instructions on the Internet worked. I had to use completely different timings than the ones given on web sites! It seems Apple likes these hard-to-guess procedures, so that only Apple can debug things, but that leaves old out-of-warranty devices out in the cold!

But that’s not the end of the story. Quickly enough, I wanted to open the main menu bar without using the mouse. I know there is a way: Ctrl-F1 to enable full keyboard access, then Ctrl-F2. This again comes from my Hackintosh experience. So I tried… with no luck.

After a little trial and error, I found out that F1 through F10 don’t produce F1 through F10, but rather special behaviors like adjusting LCD brightness, volume, keyboard backlight, etc. Fine, no problem with that, as long as I can get F1 through F10 some other way. From previous experience with laptops, I figured that the Fn key combined with F2 would do the trick, so Ctrl-Fn-F2 should open the menu bar. Well, no! The Fn key seems defective and unresponsive as well, same for Escape!

Every Google search about this got me absolutely no positive results. It seems I would have to replace the keyboard, which would involve purchasing a completely new chassis, disassembling the MacBook Pro and reassembling it in the new chassis. This is just nonsense and too costly a job. I’m not ready, neither technically nor mentally, to engage in such an operation. I could do it on a desktop PC if I had to, because I know how they are put together, but I have no knowledge, no mental model, of how the components of a MacBook Pro fit together.

The lack of the Fn key also prevents producing the Home, End, Page Up and Page Down keys by combining Fn with the arrow keys. I will probably discover other impossible-to-obtain keys. For now, it is far better to hook up an external USB keyboard to this machine, especially to sit in front of a larger external monitor when using it.

Even with an external USB keyboard, some shortcuts don’t always respond. For example, Ctrl-F2 doesn’t always open the menu. It seems that when the machine is busy, it just happily skips the shortcut! Sometimes the machine produces an annoying beep when I hit Ctrl-F2 instead of popping up the menu. Combined with the other issues, this sometimes made me mad, almost drove me crazy!

The use of a PC keyboard complicates things. To get a /, I have to combine right Alt with the é key. The same right-Alt acrobatics are needed for ~, {, }, [, ], |, etc. The ù is obtained with the key just left of the number 1, while on the MacBook Pro keyboard, it is left of the Z key.

Some keyboard shortcuts I am used to just don’t work: Windows-Tab instead of Alt-Tab, Windows-L instead of Ctrl-L, etc. The Home key, rather than going to the beginning of the current line, jumps to the beginning of the file, forcing me to scroll back to the point I was editing. This happens in text processors as well as in text edited in web interfaces such as WordPress. It is minor compared to the problems caused by defective keys or shortcuts not always responding, but it all adds up to a really bad and extremely frustrating user experience.




Screen becoming black sporadically

I often have trouble tracking the mouse pointer on the screen, especially the small default one. When I lose track of it, I bring it back to the top left corner of the screen. The first time I did that on the Mac, everything went black. Even when I moved the mouse pointer or hit a key, nothing happened. The screen reappeared after a few seconds. This happened a few times; the other times, it was possible to leave this mode by just moving the mouse.

I figured out two things. First, the screen went black every time I brought the mouse pointer to the top left corner. Second, the machine was so overloaded at startup that things didn’t respond properly, not even exiting from this black mode.

I looked into the system preferences for a solution about this. The preferences are accessible from the Apple menu, available from all applications.

Image 2

I found something quite interesting in the preferences for Exposé & Spaces: configurable hot corners.

Image 4

The top left corner was configured to trigger the screen saver! AH! I clicked on it, which popped up a menu offering different options, including one to disable this unwanted (at least for me) behavior.

Image 3

In conclusion: not a bug, just a feature. At least this one is configurable, as opposed to GNOME 3’s intrusive hot corners, which cannot be tweaked from any GUI, at least in the versions I tried.


An old Mac = old and new problems

This is the first post of what I expect to be a very long saga. I don’t know exactly what I will gain from this venture, but I am tempted to engage in it out of curiosity and a taste for exploration. Will I get more patience, experience with Final Cut Express, with Mac OS X? Maybe, maybe not. For now, the only thing I am getting is trouble. Nothing works as expected. Keyboard shortcuts randomly fail, the mouse moves erratically, scrolling in web pages is slow and choppy, and every, I say EVERY, web page shows up with a tiny font. I have to hit Command-+ more than 15 times to get a zoom level I am comfortable with.

This Mac comes from my brother’s girlfriend. Since she is nicknamed Poney, I decided to call this machine the PoneyMac for the time being. When I consider the beast tamed, it will symbolically become mine and I may assign it a new name of my own creation, but for now, this is The Mac, not my Mac.

This is a 2006 MacBook Pro with 1 GB of RAM and a hard drive whose exact size I don’t know yet. The machine runs Mac OS X 10.5 Leopard, unfortunately a 32-bit version. I’m not yet sure whether the EFI is capable of booting 64-bit OSes; maybe not, which would be quite bad.

All future posts of this saga will get the PoneyMac category, with tags corresponding to the topic.


Another upgrade that breaks things down

When I tried to write a WordPress post today, I found out that I couldn’t enter anything in the main text edit area. I tried several times, saved the draft without any success, and then realized I would be blocked unless I AGAIN tried a different browser. I just cannot continue like this if I now have to keep several browsers open and switch from one to the other. That breaks any possibility of efficiently switching between windows: I would end up with several similar-looking windows, with no way to quickly distinguish them, as opposed to one browser window with tabs.

A Google search about this led me to a forum post. Some people were experiencing similar issues, again after a WordPress upgrade! Some manually reinstalled an old version without success, but god, I cannot manually install things on my dumb HostPapa account, because I have NO SSH access! I am just exhausted from facing the same problems I cannot resolve unless I completely switch gears, reformat, start over, etc.

A forum post suggested disabling all extensions, clearing the browser cache and switching to the default theme. I tried the cache: no success. I never changed the theme, so I am probably using the default. I then had to disable extensions one by one. The culprit was TablePress.

But the fundamental question is: what’s the point of having any extension if they constantly need to be turned off for one reason or another?

I’m now stuck with an unstable content management system whose extensions are broken and there is no way to improve over this without manually moving everything to something else.


Cannot transfer files anymore from my Galaxy Nexus through USB

Last Friday, I tried to hook up my Galaxy Nexus phone through USB to transfer some files to my computer. After a few seconds, a completely empty Nautilus window appeared. Once again, Ubuntu was incapable of detecting my device. This happened a few versions ago, and I had to use obscure, impossible-to-remember MTP commands to perform transfers. I didn’t want to search for those commands again, tired of losing more and more time on artificial problems. If Ubuntu degrades like this from one version to the next, I will be better off switching to Windows and installing Linux only in virtual machines. This is not the only issue I have, and most bugs (mouse pointer, Emacs, M-Audio sound stability) persist across version upgrades. Canonical now seems focused on Mir and newer Unity versions, which I really dislike, because Mir will break everything for five or six versions. Either that, or Canonical will cut corners on keyboard accessibility, resulting in a UI that will be almost unusable for me. I expect this will be my hardest Linux time ever when that beast comes out, until it stabilizes.

However, my actual bug was worse than this: the phone didn’t connect under Windows either! When I plugged the phone into a Windows machine, an empty Explorer window similar to the one below came up, and nothing else. Does that mean my phone is dying, progressively losing functionality? Probably.

Capture d'écran 2014-11-30 09.04.15

I didn’t attempt any Google search about this. It would be worthless. I would find forum posts about people replacing the cable, doing factory resets, sending their phone for repair or replacement, etc. The phone hooks up and something is detected by the computer, so why would a new USB cable help? Yes, I can do factory reset after factory reset, and that will probably fix it, but what’s the point if I know I will have to redo it a few months later, for no reason, unless I install NOTHING on the phone? And all I would get through technical support are no-go solutions, or offers for a new phone that would force me to switch to a more expensive plan with my provider, or to buy something from nowhere with an old version of Android.

Before accepting this conclusion and starting to look at whether I could get a Google Nexus phone from Google rather than going through Fido and affecting my phone/data plan, or get something somewhat decent elsewhere, I tried with a different cable: same effect. Then I saw the home screen on the device and remembered I had recently set up a PIN to protect my Google account from any tampering by somebody who might get hold of my phone if I lose it, or steal it from me.


I entered my PIN and saw with surprise and relief the following window on my computer:

Screenshot 2014-11-30 09.05.34

Tada! This time, the solution was simple! This was just normal data protection! My USB connection was still working!

Note that locking the phone doesn’t shut off USB access until the cable is disconnected. The PIN also doesn’t prevent me from answering calls, so this is not as problematic as I feared it would be.


Firefox Sync duplicating bookmarks

A few months ago, there was an update to Firefox Sync. The system was completely changed, requiring the creation of a new account and the reupload of all bookmarks, history information, etc. The old system was still functional, but it wasn’t possible anymore to add devices.

Despite my frustration (yet another account to create), I registered with the new system and started removing my other instances of Firefox from the old system to attach them to the new one. I have quite a few such instances: one on the Windows side of my personal computer, one on its Ubuntu side, one on my personal ultrabook, one on my HTPC, another on my company’s laptop, another on my company’s ultrabook, and three on different virtual machines! That seemed to go pretty well, just needlessly time-consuming.

However, a few days later, I noticed it had become almost impossible to find anything in my bookmarks. Some bookmarks disappeared, but looking a bit deeper, I found multiple copies of bookmarks as well as of bookmark folders. This was a total mess. It would have taken me hours and hours to clean up. I thought about exporting the mess to JSON, writing myself a script to clean it up, and restoring from the processed copy, but I didn’t feel like working on this during my spare time. I thus left it alone and almost stopped using bookmarks. Sometimes, typing text in the URL bar found stuff on Google, in bookmarks or in history. Sometimes, I got the link from within an email. Other times, I had to open up the bookmark manager and search endlessly.
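As a rough illustration of the clean-up script I had in mind, one could at least enumerate the duplicated URIs in a Firefox JSON backup before deciding what to prune. This is only a sketch: the sample file below stands in for a real backup, and the crude pattern match stands in for proper JSON parsing.

```shell
# Sketch: list duplicated bookmark URIs in a Firefox JSON backup.
# /tmp/bookmarks.json is a tiny stand-in sample, not a real backup.
cat > /tmp/bookmarks.json <<'EOF'
{"children":[{"uri":"http://a","title":"A"},
             {"children":[{"uri":"http://a","title":"A copy"},
                          {"uri":"http://b","title":"B"}]}]}
EOF
# Pull out every "uri" value, then keep only the duplicated ones.
grep -o '"uri":"[^"]*"' /tmp/bookmarks.json | sort | uniq -d
```

On this sample, only http://a comes out as a duplicate; a real script would then decide which copies to drop before restoring the cleaned file.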

Last week, I decided to try something about this: simply restore a backup copy of my bookmarks. I thought about an old copy I had made in January after a clean-up of the bookmarks, then found that Firefox offered options to restore bookmarks saved periodically. I took the latest day with a 400k bookmark file rather than a 700k one, and that did the trick!

Unfortunately, a week later (yesterday, in fact), I found out that the duplication had started again! So it seems that Firefox Sync now simply makes one copy of the bookmarks per instance of Firefox it syncs with! Why not, in this case, offer these copies in separate folders, so at least one would know what to expect?

I searched for a long time for a better way to manage my bookmarks, and everyone has adopted their own inconvenient or outdated solution. Some use iCloud with Safari to sync bookmarks and manually transfer them to Firefox, others propose Xmarks, which was discontinued years ago while its web page still offers the tool (last time I tried it, it couldn’t sync; it just sat there trying to connect to a server), others have adopted EverSync, others swear that Delicious is the best, etc. It seems I would have to choose one of these and be prepared to start the research over a few months later, to find the replacement for my choice after it goes down in turn.

I tried to restore the backup once more. Maybe there is a problem with my SQLite Firefox DB on Windows 8.1; this Windows 8 box is more and more flooded with crap and would require a reformat/downgrade to Windows 7, which I just don’t want to do. Maybe, I thought, if I restore the backup on the Windows 8 box, the DB will be clean, and that will sync up with the rest. I tried, then went to my HTPC to see if things would fix themselves. No result. I found out that my HTPC Firefox was still using the old sync, so I updated it.

Then the duplicate bookmarks came back again! I got fed up and removed all my bookmarks. There is no point in having bookmarks if Firefox Sync copies them once per machine it connects to, and I just cannot get rid of all my computers except one. That one would have to be my company’s laptop, which would end any possibility of playing Minecraft or attempting to compose music with Ableton Live, and I don’t feel comfortable leaving a copy of my personal data, including my diary, on my company’s laptop! And I would have to do all this just because of Firefox? No!!!

I was about to switch to another strategy: using Evernote to store links. There is a capture tool, the Clipper, that allows me to make a copy of a web page along with its original link, so it can act as a kind of bookmarking system. This is at least better than the poor man’s system I was considering more and more, consisting of writing a plain old HTML page with links, uploading it to my HostPapa account and updating it from time to time.

But today, I found out that multiple Firefox instances got the same name. My Windows and Ubuntu installations share the same host name since they don’t run simultaneously, and I use the same login name, so Firefox gave the same default name to these computers. This may explain why it got the bookmarks mixed up! Moreover, the update from the old to the new system may have caused additional bookmark duplication.

With this hypothesis in mind, I started all my Firefox instances, one by one, and verified that they all had different names and that the removal of bookmarks had propagated correctly. At this point, all my Firefox instances have their bookmarks removed, but I need to be extra sure there is no leftover instance I forgot that could restart the duplication like a virus. While doing that, I felt a bit like Jean-Luc Picard going from one version of the Enterprise to another, in different time periods, to repair a temporal anomaly. Of course, my bookmark problem is far less dangerous than that!

I don’t know yet if that will allow me to put back my bookmarks in one instance of Firefox and see them reappear elsewhere, without duplication.


One SSD for my HTPC

A bit more than a month ago, I successfully transferred my dual-boot Windows 8.1 and Ubuntu 14.04 setup from two 120Gb solid state drives (SSD) to a single 240Gb drive. I ran into several problems restoring the bootloaders of the two operating systems, thought many times I would have to reinstall, then figured out a way to make them boot.

But what happened to the two drives I removed from my main computer? Well, they sat on top of a shelf. At least one drive would be repurposed: it would become part of A.R.D.-NAS, my HTPC. On Sunday, October 26 2014, I finally got the time and courage to undertake the transfer operation. This time, the software part was pretty smooth, but the hardware part was a needlessly intricate puzzle. During the process, I wondered many times about the purpose of generic hardware if it doesn’t fit well together, and grumbled about the lack of any viable alternative.

The sacrifice

Well, my NMedia HTPC case has six 3.5″ drive bays. This is quite nice for an HTPC case. This is possible because I chose an ATX case, in order to fit a motherboard with rear S/PDIF audio connectors rather than just headers accepting brackets I could find nowhere. This case is a bit bulky; I would build from a MicroATX case if I had to start from scratch.

So installing this SSD seemed obvious at the start: just add the drive, transfer the Linux boot partition from the hard drive to the SSD, remove the original boot partition, set up GRUB on the SSD and tada. No, things are rarely that simple. I thought my motherboard had only 4 SATA ports, and they were all used: one 1Tb hard drive, a second 1.5Tb hard drive, a third 3Tb hard drive, then a blu-ray reader/writer. Why so many hard drives? Well, I am ripping and storing all my video disks, even the huge blu-rays, to avoid the need to search for them on shelves.

Even if I had remembered correctly that I had six ports on the motherboard (two are free!), my PSU only had four SATA power connectors, so I would not be able to easily and reliably connect all my drives. I could try to find some splitter cables or Molex-to-SATA adapters, but that would add a failure factor. I could replace my PSU with one offering more SATA power cables, but it would also have more Molex cables, PCI Express connectors, etc. Unless I went with a more expensive modular PSU, all these cables would have cluttered my case.

The safest and cheapest solution was to sacrifice one of the hard drives, the 1Tb one of course, the smallest. I thus had to move files around to get below 120Gb of stuff on the hard drive that would be moved to the SSD. That process took a lot longer than I expected. My poor HTPC spent the whole Saturday afternoon copying files around! Fortunately, this is machine time, so I had plenty of time to experiment with music creation on my still new UltraNova synthesizer combined with Ableton Live’s multi-track abilities.


On Sunday, I first burned the Ubuntu 14.04 ISO to a DVD. Yes, Ubuntu is now too large for a CD and only fits on a DVD. After that, I shut down the Minecraft server running on my HTPC and moved its files to another old PC. I started the server on the old PC and reconfigured the port mapping on my router. This way, if my friend wanted to kill a few zombies and creepers while I was installing my SSD, he would be able to do so, and I would not be stressed if something bad put my HTPC out of service (like something stuck in the CPU fan breaking it).

I then removed the cover of my HTPC and spent quite a bit of time trying to figure out which one was the 1Tb hard drive. Based on the position of the SATA connector on the motherboard, I presumed it was the left-most drive. I thus had to disconnect the drive in the middle bay and use the freed-up power and data connectors to hook up my SSD. I then booted up the machine.

The following picture shows the drive temporarily hooked up.


Then I remembered my old 22″ LCD that I stopped using after purchasing my Dell touch screen. I went to pick it up in my computer room, put it on my kitchen table and plugged it in. This way, I would have the screen right in front of me, keyboard on the table, rather than in front of my 46″ LCD HDTV with the keyboard on my knees.

With the SSD and the LCD hooked up, I booted my HTPC and quickly stuck the Ubuntu DVD in the blu-ray drive. After an incredible amount of time, the machine finally booted into the live Ubuntu DVD!

Data transfer

After Ubuntu started, I launched GParted and realized that I had chosen the wrong hard drive. The 1Tb drive containing my Ubuntu setup was the disconnected one. Oh no! So that meant I would have to turn off the machine, connect the right drive and wait once again through this stupidly long, almost five-minute, DVD-based boot? No, not this time! Feeling like a cowboy, I decided to try something: drive hot swapping. This is possible with SATA, so let’s see! I disconnected the 1.5Tb hard drive, starting with the SATA data cable, then the power cord, then hooked up the 1Tb drive. Hearing the old hard drive coming back to life was kind of thrilling. Everything went well, no cable got stuck in my CPU or rear fans, and the PC didn’t freeze like it would have with IDE. The hot swap worked.

After that, things were relatively straightforward. As with my main PC, I used GParted to transfer my Linux boot partition and reconstruct the layout. Fortunately, I remembered to reset the partition table beforehand. If I hadn’t, the GPT that was on my SSD would have caused booting issues that would have driven me mad! I would probably have ended up reinstalling everything, angry at Ubuntu, at technology and probably at the whole of humankind. A single step, recreating the msdos partition table from GParted before the transfer, saved me from all that!

The following picture shows my LCD, on which we can see the progress of the transfer.


See how bulky this setup was: HTPC on the floor, case opened, SSD hanging on top. Fortunately, it was possible to make this setup clean again after all this.


The home partition: too big to fit on the SSD

Unfortunately, GParted didn’t want to transfer my home partition to the SSD, because it was obviously too large. I could have shrunk it in order to copy it, but I wanted to avoid altering the hard drive in case something bad happened. I thus instructed GParted to simply create a blank Ext4 partition and used cp to perform the copy. The following terminal session shows how I managed to do it in such a way that all file metadata (timestamps, ownership, permissions) was preserved.

ubuntu@ubuntu:~$ mkdir /media/old-home
mkdir: cannot create directory ‘/media/old-home’: Permission denied
ubuntu@ubuntu:~$ sudo mkdir /media/old-home
ubuntu@ubuntu:~$ sudo fdisk -l /dev/sda

Disk /dev/sda: 1000.2 GB, 1000204886016 bytes
255 heads, 63 sectors/track, 121601 cylinders, total 1953525168 sectors
Units = sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x000e2c4d

   Device Boot      Start         End      Blocks   Id  System
/dev/sda1   *          63    40965749    20482843+  83  Linux
/dev/sda2        40965750    81931499    20482875   83  Linux
/dev/sda3        81931500  1953520064   935794282+   5  Extended
/dev/sda5        81931563  1943286659   930677548+  83  Linux
/dev/sda6      1943286723  1953520064     5116671   82  Linux swap / Solaris
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sda5 /media/old-home/
ubuntu@ubuntu:~$ ls /media/old-home/
eric  lost+found  mythtv
ubuntu@ubuntu:~$ sudo fdisk -l /dev/sdg

Disk /dev/sdg: 120.0 GB, 120034123776 bytes
255 heads, 63 sectors/track, 14593 cylinders, total 234441648 sectors
Units = sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x000d9a0c

   Device Boot      Start         End      Blocks   Id  System
/dev/sdg1            2048    40968191    20483072   83  Linux
/dev/sdg2        40968192    81934335    20483072   83  Linux
/dev/sdg3        81934336   234440703    76253184    5  Extended
/dev/sdg5        81936384    92170239     5116928   82  Linux swap / Solaris
/dev/sdg6        92172288   234440703    71134208   83  Linux
ubuntu@ubuntu:~$ sudo mkdir /media/new-home
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sdg6 /media/new-home/
ubuntu@ubuntu:~$ sudo cp -a /media/old-home/* /media/new-home
ubuntu@ubuntu:~$ ls -a /media/new-home/ -l
total 36
drwxr-xr-x  5 root root  4096 Oct 26 19:49 .
drwxr-xr-x  1 root root   100 Oct 26 19:44 ..
drwxr-xr-x 69 1000 1000 12288 Oct 25 22:52 eric
drwx------  2 root root 16384 Sep 26  2009 lost+found
drwxr-xr-x  3  122  130  4096 Jan 24  2011 mythtv

The main idea is to mount both the old and new partitions, then use cp with the -a option and root access (with sudo) in order to preserve everything. The operation went smoothly.
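To see why the -a flag matters, here is a small throwaway experiment contrasting archive mode with a plain recursive copy; the /tmp paths are arbitrary:

```shell
# Demonstrate that cp -a (archive mode) preserves permissions and
# timestamps, while a plain cp -r resets the timestamps.
rm -rf /tmp/demo-src /tmp/demo-archive
mkdir -p /tmp/demo-src
echo "data" > /tmp/demo-src/file
chmod 640 /tmp/demo-src/file
touch -t 200901010000 /tmp/demo-src/file   # backdate the file
cp -a /tmp/demo-src /tmp/demo-archive      # keeps mode and mtime
stat -c '%a %y' /tmp/demo-src/file /tmp/demo-archive/file
```

Both stat lines should match. Repeating the experiment with cp -r instead shows the copy getting a fresh timestamp and, depending on the umask, different permissions; for a home partition, ownership also matters, which is why the copy was done as root.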

The boot loader

Even after copying all Ubuntu-related data from my old hard drive, my SSD was still not bootable. To make booting off the SSD possible, I had to install GRUB. Unfortunately, reinstalling GRUB on Ubuntu is not as simple as it should be. If there is a package doing it, why isn’t it built into Ubuntu’s image? Maybe because, for most setups, reinstalling from scratch takes 15 minutes. That’s true, but then what about the tweaks to fix the too-small mouse pointer, make XBMC work with S/PDIF sound, reinstall MakeMKV, etc.? Each step is simple, at least when no unexpected difficulty creeps in, but the sum of small things to tweak makes it long.

So let’s avoid this by running the following!

ubuntu@ubuntu:~$ sudo mkdir /media/ubuntu
ubuntu@ubuntu:~$ sudo mount -t ext4 /dev/sdg1 /media/ubuntu/
ubuntu@ubuntu:~$ sudo mount --rbind /dev /media/ubuntu/dev
ubuntu@ubuntu:~$ sudo mount --rbind /sys /media/ubuntu/sys
ubuntu@ubuntu:~$ sudo mount --rbind /proc /media/ubuntu/proc
ubuntu@ubuntu:~$ sudo chroot /media/ubuntu
root@ubuntu:/# grub-install /dev/sdg
Installing for i386-pc platform.
Installation finished. No error reported.
root@ubuntu:/# update-grub
Generating grub configuration file ...
Warning: Setting GRUB_TIMEOUT to a non-zero value when GRUB_HIDDEN_TIMEOUT is set is no longer supported.
Found linux image: /boot/vmlinuz-3.13.0-37-generic
Found initrd image: /boot/initrd.img-3.13.0-37-generic
Found linux image: /boot/vmlinuz-3.13.0-36-generic
Found initrd image: /boot/initrd.img-3.13.0-36-generic
Found linux image: /boot/vmlinuz-3.13.0-35-generic
Found initrd image: /boot/initrd.img-3.13.0-35-generic
Found linux image: /boot/vmlinuz-3.2.0-61-generic
Found initrd image: /boot/initrd.img-3.2.0-61-generic
Found linux image: /boot/vmlinuz-3.0.0-17-generic
Found initrd image: /boot/initrd.img-3.0.0-17-generic
Found linux image: /boot/vmlinuz-2.6.38-12-generic
Found initrd image: /boot/initrd.img-2.6.38-12-generic
Found linux image: /boot/vmlinuz-2.6.32-25-generic
Found initrd image: /boot/initrd.img-2.6.32-25-generic
Found linux image: /boot/vmlinuz-2.6.31-21-generic
Found initrd image: /boot/initrd.img-2.6.31-21-generic
Found linux image: /boot/vmlinuz-2.6.28-16-generic
Found initrd image: /boot/initrd.img-2.6.28-16-generic
Found memtest86+ image: /boot/memtest86+.elf
Found memtest86+ image: /boot/memtest86+.bin
Found Ubuntu 14.04.1 LTS (14.04) on /dev/sda1

The main idea here is to create a chroot environment similar to my regular Ubuntu setup, then install GRUB from there. I ran update-grub to make sure any disk identifiers would be updated to point to the SSD rather than the old hard drive. Unfortunately, a small glitch happened: update-grub detected the Ubuntu setup on my old hard drive. To get rid of this, I had to unmount the old hard drive and unplug it! After rerunning update-grub, I got the correct configuration.

Updating mount points

Since I rebuilt the home partition rather than copying it, its UUID changed, so I had to update the mount point in /etc/fstab. I thus had to run the following to find the new UUIDs:

root@ubuntu:/# cd /dev/disk/by-uuid/
root@ubuntu:/dev/disk/by-uuid# ls /dev/disk/by-uuid/ -l | grep sdg6
lrwxrwxrwx 1 root root 10 Oct 26 15:37 fb543fcb-908a-463d-bc1f-896f1892e3ad -> ../../sdg6
root@ubuntu:/dev/disk/by-uuid# ls /dev/disk/by-uuid/ -l | grep sdg1
lrwxrwxrwx 1 root root 10 Oct 26 16:11 54f4cbd6-aed0-4b43-91c0-2f8d866f3ee3 -> ../../sdg1

Once I had figured out the UUIDs, I opened /media/ubuntu/etc/fstab with gedit and made sure the mount points were correct. I only had to update the UUID for the /home partition.
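The resulting /home line in /etc/fstab looks something like this; the UUID comes from the listing above, but the mount options shown are just typical defaults, not necessarily the ones in my file:

```
# /home on the new SSD partition (ext4)
UUID=fb543fcb-908a-463d-bc1f-896f1892e3ad /home ext4 defaults 0 2
```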

Test boot

After all these preparatory steps, it was time for a test! I powered off the computer and made sure my old 1Tb hard drive was unplugged and my new SSD hooked up. I turned on the PC and waited through the forever-lasting BIOS POST. Why does it take so long for a desktop to boot while a laptop almost instantly hands off control to the OS? After the BIOS handed off control to the OS, I got a blank screen with a blinking cursor, nothing else. I tried a second time: same result.

So after all these efforts, did I really have to format and reinstall from scratch? It seemed so. Before doing that, I rebooted my machine once again and entered the BIOS setup by hitting the DEL key. Once there, I looked at the hard drives and found that the SSD was hooked up, but not on SATA port 0.

I turned off the machine and connected the drive to a different port, the one that seemed to be the first. Looking into the BIOS setup again, my SSD was now on port 0. OK, let’s try that one last time!

After a blank screen lasting too many seconds for an SSD boot, making me fear a frustrating reinstall, the Ubuntu logo appeared, and my desktop finally came up! A quick check confirmed that all the hard drives were present, except of course the disconnected 1Tb one. The SSD was ready to be installed into the machine!

The hardware part

The downside of SSDs is that they don’t seem to fit in regular desktop cases, only in laptops! This is a very frustrating limitation. Why are these drives all 2.5″, or why don’t cases have 2.5″ bays? When I shopped for my computer case, only high-end ones had 2.5″ bays, and those came with fancy mechanisms to make the drive removable without plugging in any cables, something that adds more failure factors. Maybe at the time I am writing this post, some cases with SSD bays are available, but that doesn’t matter; I won’t change my case unless I really need to!

Before installing the SSD, I first removed that old 1Tb drive. I just had to remove four screws from my drive cage and slide the drive out.



To help me install my SSD into my HTPC case, I had a bunch of screws as well as an OCZ bay bracket. Just screwing the drive into the adapter’s tray took me forever: I had trouble finding screws that fit, there was a screw already in one of the SSD holes, I don’t know exactly why, and it took me almost five minutes to realize it. I then had trouble aligning the screws with the holes, was getting more and more tired and prone to dropping screws, etc. At least the screw I dropped fell on my table, so I didn’t have to search for it on the floor forever.

The following picture shows the drive in the bracket.


Then I had to screw that assembly into the drive cage of my case. Unfortunately, the upper bays of the cage only offer bottom holes, while the SSD adapter has only side holes! I thus had to screw the adapter into one of the bottom bays, which are definitely suited to hard drives with their rubber pads to absorb vibration. None of my screws fit well. It seems that the SSD adapter has holes smaller than normal, while the screws for the drive bay are larger than usual! I got it done after more than 15 minutes of attempts. I thought many times I would have to postpone this job and wait for my father to come by with a drill to make some new holes in the SSD bracket or the case.

The following picture shows the drive in the cage.


I don’t know exactly how much time I spent on this installation, but at the end, I was tired and asking myself whether all this would be worth it.

Well, after the SSD was screwed in and the drive cage was back in my HTPC case, I realized I wouldn’t be able to hook up my four SATA drives! No matter what I tried, there was always one drive lacking power. This was because the SATA cables coming out of my power supply unit were too short to accommodate the drive layout I had come up with! OK, I was at a dead end now.

Before giving up and bringing that beast to a computer store in the hope they would figure out a way to hook the four drives up (maybe with some extension cable I don’t have, or with a new PSU), I remembered that the 1Tb drive I removed had been in the middle upper bay, which was now empty. My only hope of getting the drive powered that day was thus to move one of my hard drives there. OK, so let’s remove the cage again and play with the screwdriver once more!

I moved my 3Tb drive from the side bay to the upper one and put the drive cage back into my case. I was then able to hook up the power. Reaching the SSD in the side bay to hook up the SATA cables was a bit tricky, but I finally got it. A last check confirmed that all my drives were hooked up, except my blu-ray writer. OK, just one more cable to plug in, and that was it!

Was this all worth it?

After all this hassle, I asked myself this question. When I booted up the machine, it seemed as slow as with the hard drive. What? Maybe the CPU is too slow after all. But when I reached the desktop and started XBMC, I felt the system was more responsive.

More importantly, the machine became a lot quieter. For a few weeks, this HTPC had been making a lot of noise. I thought it was the CPU fan stressed out by the Minecraft server running on the system, but the 1Tb hard drive was contributing to the noise as well. I suspect it was emitting more and more heat, causing the temperature to rise inside the case and heating up my poor little CPU. The CPU fan then reacted by spinning like crazy.

Even after I restarted my Minecraft server, the noise didn’t come back. I am still surprised by this effect, which I didn’t expect.

This 1Tb hard drive is definitely getting old and has emitted suspect sounds a few times. I wonder if it would have failed and died had I left it in the machine. This SSD move thus saved me an unexpected reinstall and will help me have a better time with this HTPC.

So yes, after all, it was worth it!