There is not much that makes me angrier than artificial barriers to exit, built for the sole purpose of forcing clients to stay with you. Godaddy is a prime example of this. It is a huge pain to move a website and all its files away from them.
I have a client who has outgrown Godaddy’s hosting, and I am moving them over to MediaTemple, which provides scalable hosting for high-traffic, database-intensive websites.
The site is made up of four open-source, web-based applications:
- WordPress for content management
- OpenX for banner advertisement serving
- phpBB for web forums
- Gallery2 for a photo gallery
Each of these applications has a separate file structure and an individual database on Godaddy. Much of the content has been uploaded through the applications themselves, so the easiest way to move the site would be to Zip or Tar the entire site, back up the databases and move the files over to the new hosting. I have probably done this thirty times in my life. It is pretty straightforward, and most of the time is spent watching files transfer.
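The usual routine can be sketched in a few commands. The paths and database names below are placeholders, and a demo directory is created just so the sketch runs anywhere; substitute your real document root and credentials:

```shell
# Demo scaffolding -- swap public_html/ for your actual document root.
mkdir -p public_html && echo '<?php ?>' > public_html/index.php

# One archive of the entire site:
tar -czf site-backup.tar.gz public_html/

# One dump per application database (placeholder names; needs the
# MySQL client tools, so it is left commented here):
# mysqldump -u dbuser -p wordpress_db > wordpress.sql

ls -lh site-backup.tar.gz
```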
Godaddy has made this impossible for the following reasons:
- They do not allow you to create Zip or Tar archives larger than 20 MB with their “file manager” tool
- You cannot run Tar or Zip from the command line without enabling SSH access, which requires that you delete all your databases for some reason. I have seen restrictions on SSH access before, but nothing like this. One of my favorite hosts, BlueHost, requires that you provide a scan of your driver’s license before they enable it. MediaTemple makes you log in to a webpage to enable it. Godaddy gets a strike against them for their stupid requirement.
I spent about an hour trying to archive the entire site, testing various fixes I found on Google, such as phpshell, cron, and a shell script. To be fair, I did not try this approach, How to backup Godaddy, but it looked like it had the same limitations as the others. After all that, I gave up and decided to just FTP the whole mess to my new server. I logged into MediaTemple via SSH and connected to Godaddy with the ftp command from the command line.
Connected to websitename.com.
220---------- Welcome to Pure-FTPd [privsep] [TLS] ----------
220-You are user number 6 of 50 allowed.
220-Local time is now 09:25. Server port: 21.
220-This is a private system - No anonymous login
220 You will be disconnected after 3 minutes of inactivity.
The command mget (for multiple get) kept timing out, and I never could get very far into the process. Nothing is worse than ending up with part of the directory structure or missing files. It would take weeks to track down missing includes and clean up the mess that missing files would create.
With direct FTP access a failure, I started to lose hope, but after a bit of googling I found a solution with my old standby, wget.
Wget is, according to its website, “a free software package for retrieving files using HTTP, HTTPS and FTP, the most widely-used Internet protocols. It is a non-interactive commandline tool, so it may easily be called from scripts, cron jobs, terminals without X-Windows support, etc.” It can be installed on OS X, Linux and Windows.
Normally, I use it to make a backup copy of a website, because it can spider through a site and download all the files, images and other pieces. It can also turn a website with .ASP or .PHP files into an HTML website by renaming the files and updating the links. Overall, wget is a bad-ass tool and can do a lot to save time and fix problems. I had never, however, used it to pull files from a flaky FTP site.
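For comparison, that mirroring use looks something like this (the URL is a placeholder; `--adjust-extension` is the flag that renames .php/.asp pages to .html, and `--convert-links` rewrites the links to point at the local copies):

```shell
# Mirror a site for offline browsing; example.com is a placeholder.
wget --mirror --page-requisites --convert-links --adjust-extension \
    http://example.com/
```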
Here is how I did it:
From the command line of the MediaTemple server, I ran the following command (obviously, substitute your own FTP username, password and domain):
wget -rv -nc --timeout=15 --random-wait ftp://*username*:*password*@websitename.com
Here is how it breaks down:
- -r is to recursively download all files
- -v is to give verbose feedback, so I can see what is going on
- -nc is for “no clobber,” i.e. do not re-download files that already exist if the command is restarted
- --random-wait is to keep the Godaddy server guessing, in case they have some sort of automated script to block this sort of thing
- --timeout=15 sets the timeout to 15 seconds. By default, wget waits 900 seconds on a read of a file. Setting this low makes it give up quickly and try again.
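Because -nc skips anything already downloaded, the command is safe to simply rerun after a timeout or connection-limit error. A crude retry loop (credentials and host are placeholders) might look like:

```shell
# Keep rerunning wget until it exits cleanly; -nc makes each restart
# pick up where the last attempt left off.
until wget -rv -nc --timeout=15 --random-wait \
    "ftp://username:password@websitename.com/"; do
  echo "transfer interrupted; retrying in 60 seconds..." >&2
  sleep 60
done
```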
BAM, the files are all downloading and starting to show up in my MediaTemple site.
The next step is to go get some lunch while it all downloads, then back up the databases and move those over too. After that, all I have to do is set up the domain on MediaTemple and move the files into place. Moving the files via the command line is as simple as “mv oldfiles/* newlocation/”. This is certainly easier than the hurdles Godaddy set up for me.
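Assuming wget dropped everything into a directory named after the old host, the final shuffle is just a move plus a database import. The names below are placeholders, and demo files are created so the sketch runs anywhere:

```shell
# Demo scaffolding -- substitute the directory wget created and your
# real document root.
mkdir -p oldfiles newlocation && touch oldfiles/index.php

# Move everything into the live document root:
mv oldfiles/* newlocation/

# Load each database dump on the new host (placeholder names; needs
# the MySQL client, so it is left commented here):
# mysql -u dbuser -p wordpress_db < wordpress.sql

ls newlocation/
```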
Halfway through the process, I did get a “421 Too many connections (2) from this IP” error, but I let it sit for a while, reran the command, and everything was fine.
All I have to say is *%$()*) Godaddy, I want the last four hours of my life back.