For the SimpleDesk website, I built a sleek, easy-to-use download manager, complete with branch, version, file and mirror management. Simply put, it is powerful and flexible. While I didn’t build it in, the script could easily be expanded to manage multiple pieces of software as well.
Author Archives: Jeremy
Time to move my MySQL data directory to another drive. It only takes a few simple commands to get started.
First, my my.cnf file.
$ sudo mv /etc/mysql/my.cnf /home/configs
$ sudo ln -s /home/configs/my.cnf /etc/mysql/
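The same move-then-symlink pattern comes up again and again below: relocate the file, then leave a symlink at the old path so nothing that reads the old location notices. A quick demonstration on throwaway files (the paths here are made up for illustration; the real commands above need sudo):

```shell
# Create a scratch layout standing in for /etc/mysql and /home/configs.
workdir=$(mktemp -d)
mkdir "$workdir/etc" "$workdir/configs"
echo "[mysqld]" > "$workdir/etc/my.cnf"

# Move the config out, then leave a symlink behind at the old path.
mv "$workdir/etc/my.cnf" "$workdir/configs/"
ln -s "$workdir/configs/my.cnf" "$workdir/etc/my.cnf"

# Anything reading the old path still sees the file.
cat "$workdir/etc/my.cnf"   # prints "[mysqld]"
```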
I should note that because I installed MySQL via apt-get, a debian.cnf file was created. I haven’t even bothered to check whether Ubuntu actually uses this file, but nonetheless I need to move it, as it contains a MySQL user/password for use by the system. That isn’t really safe, considering it is effectively a root account, though setting open_basedir restrictions helps with that. The conf.d folder also holds a mysqld_safe_syslog.conf file; I don’t use mysqld_safe, so I don’t care about it.
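For reference, the Debian/Ubuntu packages generate debian.cnf along these lines; the password is random per install, so this is a sketch, not my real file:

```ini
# /etc/mysql/debian.cnf -- generated by the Debian/Ubuntu MySQL package
[client]
host     = localhost
user     = debian-sys-maint
password = <random, generated at install time>
socket   = /var/run/mysqld/mysqld.sock
[mysql_upgrade]
host     = localhost
user     = debian-sys-maint
password = <same random password>
socket   = /var/run/mysqld/mysqld.sock
```

The debian-sys-maint account is what the package scripts use to stop the server and run upgrades, which is why the file has to stay readable at its expected path.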
$ sudo mv /etc/mysql/debian.cnf /home/configs
$ sudo ln -s /home/configs/debian.cnf /etc/mysql/
Now for a quick test, I restarted MySQL via the restart command. A very helpful command, and easier to type than going through init.d.
$ restart mysql
Everything still works. So now for the final touch. Moving the directory.
$ service mysql stop
$ mv /var/lib/mysql /home/data
$ ln -s /home/data/mysql /var/lib
$ service mysql start
Now I won’t lie, at this point something went horribly wrong. I have yet to figure out why. I have done this many times before and never had an issue. After trying everything I could think of to get mysql started, get rid of the errors and even moving it back, I still had no luck. I ended up restarting the entire box and after that things just worked. So I tried again and then everything worked just fine the second time around. I have no clue why it failed the first time.
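A plausible suspect, though I never confirmed it on this box: Ubuntu ships an AppArmor profile for mysqld that only permits access under /var/lib/mysql, so a relocated data directory produces mysterious permission errors until AppArmor is told about the new path. The usual fix is an alias (the path below matches the move above):

```
# /etc/apparmor.d/tunables/alias
alias /var/lib/mysql/ -> /home/data/mysql/,
```

followed by reloading AppArmor with `sudo service apparmor reload` before starting MySQL again.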
Just to add a finishing touch, I edited /home/configs/my.cnf and changed datadir in it to point to /home/data/mysql.
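The relevant section of my.cnf ends up looking like this (only the datadir line changes; everything else stays as the package shipped it):

```ini
[mysqld]
datadir = /home/data/mysql
```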
That takes care of that. Next is figuring out which configuration files I need to carry over for my mail setup. Hopefully, after all that, my web site will be able to switch from Ubuntu to another operating system and be up and running in no time.
A random thought has hit me. Most people try to keep their MySQL user credentials secure. But why? If a server has been set up properly, it becomes a moot point.
The idea occurred to me while thinking about opening up a site’s source code. If I opened the site up, I would be giving out access to my settings and configuration files, which also contain MySQL user credentials. So either I attempt to remove those, or I disallow access. But then I wondered why even worry.
I will use my own site as an example. If I gave out the MySQL user credentials for my inactive forum, what good would they do someone? phpMyAdmin sits behind an HTTP auth page (over SSL) before you can even supply MySQL credentials, and I have configured all my MySQL users to allow localhost connections only, so only my server itself can connect.
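For anyone wondering, locking an account to localhost happens when the user is created; the account and database names below are hypothetical, not my real ones:

```sql
-- hypothetical account and database names
CREATE USER 'forum'@'localhost' IDENTIFIED BY 'secret';
GRANT ALL PRIVILEGES ON forum_db.* TO 'forum'@'localhost';
```

With no `'forum'@'%'` entry, connection attempts from any other host are refused before the password is even checked.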
So if somebody had my MySQL user credentials, they would be completely useless. If an attacker managed to exploit the server and upload files that do malicious things, that script could most likely find and read the settings file anyway, provided it sat somewhere inside the open_basedir restrictions for that site. And if they can exploit the server, they can do more damage than just logging into MySQL. Since only I have a login to my site (secured behind SSH), very few web-accessible files are writable by Apache. To fix any MySQL damage, all I need to do is restore the MySQL data (users as well) from a backup. File damage is much worse, since it makes it easier to leave a backdoor into the system.
Although I don’t run any control panel and use phpMyAdmin simply for ease of access, the same applies to sites running admin panels such as cPanel. Unless an attacker has the cPanel login information, or the user installed phpMyAdmin for some reason, or configured their MySQL users to allow outside connections, the credentials are useless, the exception again being an attacker who can upload a malicious file.
For shared servers, this could be a worry if your MySQL credentials are publicly known and a hacker happens to also have a site on your shared server. So my above points will have little value if your server is shared. Shared servers carry a risk and that risk means attempting to protect all your credentials more heavily, as an attacker could simply be on the same server as you.
Of course, this all depends on the server admin and webmaster having properly set things up beforehand, such as access to phpMyAdmin and other scripts. However, I think it still makes a good point: even if MySQL credentials are publicly known, they don’t offer an attacker much.
Like all good conversations, this was brought up in IRC.
Microsoft decided it would be a good idea to silently install an add-on into Firefox for anyone with certain of its services installed. The major point being it is a hidden, unknown add-on that you cannot remove yourself. How friendly is that?
Google’s Chrome does the same thing. After running it for the first time, I discovered that it had installed a hidden service into my user’s Application Support folder (Mac OS X). I ran a few commands as root to remove the file and chmodded its location to “0000”. I removed Chrome as well and checked every file it had modified.
Other than being totally shocked that Chrome installs a service without my permission, I am now questioning whether to keep using any Google applications at all. This would all have been fine if Google Chrome had asked for permission to install a service that supposedly “checks for updates”. Of course, I wouldn’t have allowed it anyway; I have enough services running on my poor laptop and don’t need to add a useless one.
Hopefully both Microsoft and Google get it straight. Although I can’t say much for Apple, which forces QuickTime and Apple Software Update on Windows users. So maybe all three need to get a clue. I want to know what you are doing to my system. Keeping this up will only push me further toward full-time Linux usage.
As a quick end note. Procrastination paid off, as I haven’t run windows updates in about 2 weeks. Just goes to show that sometimes procrastinating can be a good thing.
My current host has a unique feature that lets me set up virtual machines easily. Since that is so easy, I may someday want to switch to another operating system, so I wanted to split my /home and configuration files out onto a separate drive, which is entirely possible with my host.
The nice thing about Linux systems is that, since they run on open source software, configuration and setup carry over from one distribution to another. So if I lay out my files correctly and use the right symlinks, I could switch operating systems without missing a beat.
I will avoid discussing the details of getting the other drive attached to my machine. Getting it working properly does take a little bit of work, all of which is easy.
First, after making sure the new drive showed up in /dev, I simply created a folder to serve as the mount point.
$ mkdir /home-new
Now that the directory exists, I simply mount the drive onto it.
$ mount /dev/xvdc /home-new
With some basic commands, I tested that the drive was functioning properly. The next step was copying the files over. However, my site already had its permissions set up, and a plain copy would leave everything owned by root again. Luckily, the copy command has an argument that preserves that.
$ cp -Rp /home/* /home-new
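The -p flag is what preserves mode, ownership and timestamps; a plain cp would create copies owned by whoever ran it (root here) with default permissions. Demonstrated on throwaway files (ownership itself only carries over when cp runs as root, but modes and timestamps are preserved either way):

```shell
# Scratch source and destination standing in for /home and /home-new.
src=$(mktemp -d)
dst=$(mktemp -d)
echo "db password here" > "$src/settings.php"
chmod 600 "$src/settings.php"

# Recursive copy that keeps permissions and timestamps intact.
cp -Rp "$src"/* "$dst"

ls -l "$dst/settings.php"
```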
Once that completed, I made sure the files all worked on the new drive. Next was editing my /etc/fstab so the drive would mount correctly on reboot. Simply put, I copied the entry for my root drive, changed the /dev device to the new drive and the mount point to /home. Just in case something went wrong, I shut off Apache and moved /home to /home-old.
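For reference, the new fstab line came out along these lines. The device name matches my setup; the ext4 filesystem type is an assumption here, so use whatever the drive was actually formatted with:

```
/dev/xvdc  /home  ext4  defaults  0  2
```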
Now, I could have easily unmounted the /home-new drive and remounted it on /home, but just to be sure everything worked, I issued a reboot command and waited for my server to come back. After the reboot, my site was working again. I wasn’t done yet, though: my Apache configuration files were still on the main drive. An easy way around this was moving all my virtual host configuration files into my home folder and symlinking them back.
$ mkdir /home/configs
$ mv /etc/apache2/sites-available /home/configs
$ ln -s /home/configs/sites-available /etc/apache2/sites-available
This completes the move of my Apache configs. I modified the default configuration to hold things like port changes and other Apache tweaks. I then simply repeated this for the other configurations I had changed and want carried over if I ever switch operating systems.
The only thing left to do is change where my mysql data is being stored. Although I will work on not breaking that the first time around some other day 🙂
WordPress by default doesn’t protect its wp-includes and wp-content folders. While WordPress doesn’t do stupid things in most of these files, it still doesn’t do a simple defined() check to ensure the request came through a privileged file. SMF does this, and it prevents direct loading of any of its Sources files.
To get around this is not as simple as it should be. To start with, I added a “.htaccess” to my “wp-includes” folder with the following contents.
Deny From All
However, that broke the built-in rich editor in WordPress. So, now to edit “wp-admin/includes/manifest.php” and change the following.
All I did was change .php to .js, since after reading through the directory I figured out the .php version is just a compressed version. I removed the “$zip&” part as well, since it no longer made sense to keep it; the “c” argument just tells it whether to compress or not. This was my resulting change.
However, since I was now loading some content from my includes folder, my “.htaccess” needed a tweak.
Deny from all
Allow from localhost
Simply put, that denies access to all PHP files in my “wp-includes” folder. That worked, and duplicating the file into my “wp-content” folder produced the same results. I still wasn’t done, though. A simple .htaccess password-protected directory for “wp-admin” offers a basic block against unauthorized access. Even without a very strong username or password on it, it still prevents the fly-by attacks.
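Written out in full, the rule that denies only PHP files wraps the two Deny/Allow lines above in a FilesMatch block (Apache 2.2-style directives):

```apache
<FilesMatch "\.php$">
    Order deny,allow
    Deny from all
    Allow from localhost
</FilesMatch>
```

Static assets such as .js and .css in the folder stay reachable, which is exactly what the rich editor needs.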
AuthName “Restricted Access”
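The complete block is only a few lines. The AuthUserFile path here is an assumption on my part; point it at wherever you keep the password file, ideally outside the web root:

```apache
AuthType Basic
AuthName "Restricted Access"
AuthUserFile /home/configs/.access
Require valid-user
```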
Now I just needed to populate that file. Since I have Apache installed on my laptop, I opened Terminal, ran “htpasswd -n username” and entered a password at the prompts. Then I copied the resulting line into my .access file and saved. Everything works, and my entire wp-admin folder is protected from unauthorized web access.
However, “wp-login.php” references three files in the wp-admin folder: “login.css”, “colors-fresh.css” and “logo-login.gif”. Copying those three files into my theme solved half the problem. Then I modified wp-login.php to reference those files directly rather than through the functions that previously loaded them. “login.css” also needed the path to logo-login.gif adjusted.
This is a simple way to setup a secure login for WordPress. Simply editing “wp-login.php” and looking for:
/** Make sure that the WordPress bootstrap has run before continuing. */
require( dirname(__FILE__) . '/wp-load.php' );
Add after that:
Now when accessing login and registration pages, the browser redirects to the secure version.
After looking into “wp-settings.php” some more, I found the following setting:
if ( !defined('FORCE_SSL_ADMIN') )
Copying the define line into my “wp-config.php” and changing false to true ensures this will keep working even if “wp-login.php” ever gets updated.
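For anyone following along, the line that ends up in wp-config.php is just:

```php
define('FORCE_SSL_ADMIN', true);
```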
When I started my site, I knew it would rarely see visitors. It is more of a personal test site than anything else. I recently decided to get rid of my own custom site and get a blog instead, mostly because my site isn’t really for communication among many individuals; rather, it’s for my own discussions and fun. In that respect, a blog makes more sense, doesn’t it?
I wanted to set up SVN on my server. Why, you ask? Well, just because I can, really. The more important reason is to get my mods and other files into a repository that would also act as a backup. I set it up on my server, as I never saw the point in keeping it on my own system.
Luckily, like most Linux systems, Ubuntu lets me do this without breaking a sweat. I won’t go into why I am running Ubuntu; I just felt like using it as my server software of choice, although I plan on looking into Debian.
When I went to set up SVN, I decided to serve it over DAV, mostly because that makes it easier to hand out URLs to the repository.
$ apt-get install subversion libapache2-svn
After that ran and I accepted the download, I was almost done. I created an SVN repository and made an initial commit into it. Although I had options for how to set up access, since I would be the only one committing to it, I went with the most basic access setup.
I had to set up my self-signed SSL certificate before I could continue setting up SVN. That is as simple as running the openssl command with the correct options; I did a Google search, since I was too lazy to read the manual.
$ openssl req -new -x509 -days 365 -nodes -out /etc/apache2/ssl/apache.pem -keyout /etc/apache2/ssl/apache.key
Although I should have generated a 2048-bit key instead of the 1024-bit one. After that, completing the setup was very simple: I just needed to set up my virtual host for SVN and I was on my way.
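For completeness, a minimal sketch of that SVN virtual host. The hostname, repository path and password file below are placeholders I made up, not my real setup:

```apache
<VirtualHost *:443>
    ServerName svn.example.com
    SSLEngine on
    SSLCertificateFile    /etc/apache2/ssl/apache.pem
    SSLCertificateKeyFile /etc/apache2/ssl/apache.key

    <Location /svn>
        DAV svn
        SVNPath /var/svn/repo
        AuthType Basic
        AuthName "Subversion"
        AuthUserFile /etc/apache2/dav_svn.passwd
        Require valid-user
    </Location>
</VirtualHost>
```

The Location block is what mod_dav_svn hooks into; with this in place, the repository is reachable at https://svn.example.com/svn for both browsers and the svn client.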