OpenSUSE Linux Rants

OpenSUSE Linux Tips, tricks, how-tos, opinions, and news


August 31, 2006

Database Connection Class in PHP for your local LAMP stack

by @ 6:03 am. Filed under SUSE Tips & Tricks

Man, school is kicking me in the face. I’m working (at least) full-time, and taking 16 credit hours at school. Dude, who thought that was a good idea?

Well, once upon a time (maybe it didn’t even happen) I seem to remember saying that I was going to post some code for all to have, see, and benefit from (I hope). It is a database connection class that I wrote in PHP and have been using for several years. Across all the databases I have worked with in that time, it’s been robust enough to handle everything I’ve ever needed.

There is a ton of explanation I could give, had I the time. However, the class has a lot of comments in it, so I’ll be somewhat abbreviated here, as I’m super short on time.

First, you’ll need to create a user on your database for use with this class. This is generally done with a command similar to the following:

GRANT ALL PRIVILEGES ON [DATABASE NAME].* TO [USERNAME]@[HOST] IDENTIFIED BY '[PASSWORD]' WITH GRANT OPTION;

If my db is called ‘customer’, my username is ‘scottmorris’, my password is ‘testpassword’ (which I would never, ever, ever, ever, ever really use), and the host I’ll be connecting from is 192.168.0.123, my SQL query may look like this:

GRANT ALL PRIVILEGES ON customer.* TO scottmorris@192.168.0.123 IDENTIFIED BY 'testpassword' WITH GRANT OPTION;
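If you’re doing it from the shell, you’d feed that statement to the mysql client as a privileged user; something like this (the root account and password prompt are the stock MySQL setup, and the database, username, host, and password here just match the example above):

```shell
# Run the GRANT statement as the MySQL root user (you'll be
# prompted for the root password).
mysql -u root -p -e "GRANT ALL PRIVILEGES ON customer.* TO scottmorris@192.168.0.123 IDENTIFIED BY 'testpassword' WITH GRANT OPTION;"
```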

Alternatively, you can create users however you normally do, maybe with phpMyAdmin or something.

In any case, create a user that the database class will be using.

Next, you’ll edit the class. Open it up. On lines 59 through 62, you’ll notice a place to put your server, username, password, and database name. Put in the appropriate values. Next, you’ll want to put the class in the include_path (look in /etc/php.ini for the ‘include_path’ directive) to make sure you can use it in your scripts.
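For reference, the include_path line in php.ini looks something like this (the /srv/www/classes directory is just a placeholder for wherever you keep the class):

```ini
; /etc/php.ini -- add the directory holding dbconn.class.php
include_path = ".:/usr/share/php:/srv/www/classes"
```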

You can now use this database connection class in your scripts. Just include( "dbconn.class.php" ); and start using it. The class itself contains documentation on the simple methods and how to use them.

Now for a couple of features that it has. First, if you insert a new record with it, a member called ‘insert_id’ will contain the id of the last inserted record. No more screwing around with how to get this. Second, you can do multiple database transactions without having to open and close the connection a bunch of times.

It has two modes, which are ‘true’ and ‘false’. If the mode is ‘true’, it will handle opening and closing the database connection automatically when you run the SQL. If it is in ‘false’ mode, you are given control over when to open and close the database connection.

For example, if you had a ‘for’ loop, and you were going to do 1000 inserts, you would definitely NOT want to open and close the database for each insert. What you would do is something like the following:

//CREATE THE OBJECT
$db = new dbConn();

//TELL THE CLASS TO LET US CONTROL THE OPENING AND CLOSING OF THE CONNECTION
$db->mode = false;

//OPEN THE CONNECTION
$db->openConn();

//RUN THROUGH OUR 'FOR' LOOP
for( $x = 0; $x < 1000; $x++ ){

	$sql = "insert into [whatever your query is]";
	$db->execute( $sql );
	
}

//CLOSE THE CONNECTION
$db->closeConn();

//TELL THE CLASS TO RESUME CONTROL OF AUTOMATICALLY OPENING AND CLOSING CONNECTIONS
$db->mode = true;

Obviously, you’d have your query there set up to do whatever you needed done. It may be that you are looping through some kind of array to do inserts or something. But that way, you could do all your inserts without beating the living snot out of your database server.

I sincerely apologize for the horrible documentation/instructions I’m providing. However, y’all are smart cookies. Besides, if you have questions, you can always leave them as comments, and I’ll answer them so that everyone who is interested can benefit.

Also, with school, I may only be able to make an appearance here once or twice a week, unfortunately.

Heh, before I forget, here’s a link to the database connection management class. Uncompress it with this command:

tar -jxvf dbconn.class.php.tar.bz2

Please do let me know if you have questions, problems, comments, or whatever. My hope is that this class will make someone’s life easier.

August 23, 2006

Reverse Tunneling with SUSE Linux 10.1 and SSH

by @ 6:21 am. Filed under General SUSE, How-To, ssh tips, SUSE Tips & Tricks, Work-Related

So now, we can’t have personal computers on the company network. This “protects against viruses being introduced to the company network,” was the explanation given to me. Never mind that I am running Linux, which isn’t susceptible to them, and certainly doesn’t perpetuate them. So my desktop is on the regular old class C company subnet (192.168.0.x), and my laptop has to be on the wireless network, which is on a completely separate subnet (192.168.1.x). Obviously, there is no way to route traffic between the two computers. So what do you do? Time to whip out the SSH tunnel, again. Only this time, it’s a reverse tunnel.

The idea before was that we set up a machine inside the network to forward traffic to a computer outside the network, which then sent it somewhere else. This time, we set up a computer outside the network to forward traffic to a computer inside the network. Then, we just connect to that outside computer, and the traffic is automatically forwarded in to the inside computer.

In my case, I set up a tunnel between my desktop machine on the 192.168.0.x subnet and my server. I told the server to forward all SSH connections that hit port 10000 to port 22 on the desktop computer. Then, I just SSH in to my server from my laptop, and my request actually ends up at my desktop computer. Because I’m using KDE and fish://, it’s essentially just like browsing a network fileshare on a local subnet, because Linux can do stuff like that.
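For the curious, the whole trick is one ssh flag. Here’s a sketch of that setup, with placeholder usernames and hostnames:

```shell
# On the desktop (inside the network): open the reverse tunnel.
# From now on, port 10000 on the server forwards back through
# this connection to port 22 (sshd) on the desktop.
ssh -R 10000:localhost:22 user@server.example.com

# From the laptop: by default the forwarded port only listens on
# the server's loopback, so hop onto the server first, then go
# through the tunnel (log in with the desktop's credentials):
ssh user@server.example.com
ssh -p 10000 user@localhost
```

If you want to hit the forwarded port directly from the laptop instead of hopping through, the server’s sshd needs GatewayPorts enabled.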

Sounds exciting, and indeed it is. If you’d like to read the tutorial I used to set this all up, head on over here. Fun stuff, baby.

August 22, 2006

SUSE and subversion

by @ 7:07 am. Filed under General SUSE

I thought that it was cool that US retailer W.S. Badcock is switching to SUSE for 320 stores. Keep it rockin’, baby.

I spent Friday installing subversion. THERE is some cool software. I spent half the day reading the subversion documentation. Then, when I felt the ADD kick in, I knew I couldn’t read anymore. I had to get started with the hands-on portion. I fired up YaST and installed subversion without a hitch. I got a repository created and edited the config and password files.

What’s cool is that subversion caches usernames and passwords so that you don’t have to type them in 10 thousand billion times every time you want to check files out or in. I could check stuff out, but I couldn’t check it back in. After a few minutes, I discovered that the default configuration makes repositories read-only. After changing a config file to allow commits, I was crackin’.
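In case anyone hits the same wall: assuming you’re serving the repository with svnserve, the file to edit is conf/svnserve.conf inside the repository. A minimal sketch of the settings that allow commits from authenticated users:

```ini
# [repository]/conf/svnserve.conf
[general]
anon-access = read       # anonymous users can check out
auth-access = write      # authenticated users can commit
password-db = passwd     # users and passwords live in conf/passwd
```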

Right on, subversion is the man, just like SUSE Linux.

August 18, 2006

SUSE Linux running Nagios is pretty cool

by @ 7:03 am. Filed under General SUSE, Work-Related

I have had an absolute blast today. I’ve always heard about how wickedly powerful Nagios is. However, I’ve also heard many versions of different horror stories about how people lost an arm to Nagios when they tried to install it. My friend Steve now has a glass eye from his ordeal with it.

Naturally, of course, not wanting to lose an arm or an eye, I’ve steered clear from Nagios for a while. That being said, my manager came to me the other day and asked me, “Is there anything open-source that can monitor our servers to let us know when things go down?” I was like, “So, now you need Linux to babysit your Windows servers lest they crash?” After the chiding, I said, “Why yes, there is a tool called Nagios.” He said, “How soon can you get it installed?” I said, “About 2 weeks.” The painful look he gave me was priceless. After gleaning as much enjoyment as I could from it, I said, “Just kidding… but probably a day or two,” really having no idea because of what I’d heard about the installation.

The thing installed in about 15 minutes. Big whoop. Then came the configuration of that bad boy. Heh, there’s where I’m guessing people hit the wall. For some reason, it seems to me that if you’ve had experience with object-oriented programming, and relational database experience, the config files kinda sorta seem to just make sense. They did for me, anyway. YMMV.

I wasn’t able to get the notifications to work, though, so I wrote my own scripts and plugged them in. They work beautifully. I set up a PING monitor for my desktop machine. I then spent the next 20 minutes turning the machine off and back on to watch the monitor go from CRITICAL to OK and back. Boy, simple minds have simple pleasures. Maybe that’s why my brother grew up eating crayons.

Anyway…

I also couldn’t find a MySQL monitor, so I just used the check_tcp monitor to connect to port 3306 on the target machine. I realize that this does not actually run a query against the database to verify that it is working properly, but it will tell me if the server is not running. Maybe I’ll fix that later. For now, it looks good to the untrained eye.
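For reference, the port check is a one-liner in the service definition. A sketch, where the host name is made up for the example and ‘generic-service’ is the template from the Nagios sample configs:

```ini
# A service definition using check_tcp to watch MySQL's port
define service{
        use                  generic-service
        host_name            dbserver
        service_description  MySQL port 3306
        check_command        check_tcp!3306
        }
```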

OK, well, I set up like 27 monitors on about 6 different machines. Fun day, tell you what.

Really it was not nearly as painful as I had been led to believe. Perhaps it was the tutorials that I used to get me started. Maybe I’ll write my own (for some reason, I get a charge out of writin’ good, clean, helpful tutorials) for anyone who may find it useful. The two tutorials I used were located on a CoolSolutions page and bobcares.com. Between the two of those articles, from the time I started until the time I was monitoring the local machine was about 25 minutes. Not too shabby.

Oh, don’t forget to install openssl-devel before you do this, otherwise you won’t be able to check your HTTPS servers (using the check_http plugin).

If you dig screenshots, here’s one of my Nagios install in action:

Nagios on Linux in action

August 16, 2006

What do Linux and frappr have in common?

by @ 7:04 pm. Filed under General SUSE, Linux News

I saw something cool on CoolSolutions (imagine that) that I wanted to share with everyone. The info on the site is as follows:

The coolest thing about open source is the people. If you’ve helped develop, test, promote, or evangelize Linux and open source, we’d love to hear from you. Take a few minutes to put yourself on the map and you could win one of these great prizes:


The idea is that you put your info in, and it plots you on a frappr map (it’s just your zip code, no home addresses, so don’t worry). You could also win some slick prizes. The action starts here.

Judge Linux by Current Experience and Nothing Else

by @ 11:47 am. Filed under General Linux, My Opinion

This article rules. People who judge Linux by hearsay, M$ FUD, or old experience (even a few months old) need to try a current distro before they can beat on Linux too hard. Especially if they tried the wrong distro. Danijel Orsolic++

Can SUSE Linux solve *YOUR* problems?

by @ 9:17 am. Filed under General Linux, General SUSE, My Opinion, SUSE Tips & Tricks

Approximately five thousand six hundred forty-six and a half times a day, I hear the phrase: What is the best application for __________ ? What is the best program for email? For serving HTTP requests? What’s the best media player? What is the best program for keeping track of when I need to get my manicure, get my hair done, or replace my toilet paper?

Just like most of life, there is generally no perfect, clear-cut, concise answer to these questions. There is generally no absolutely correct answer for everything, all the time, in every single situation. That said, a question like “What’s the best scripting language?” is unanswerable as it stands. Once you instead ask, “What’s the best scripting language for my current situation?” you can start working with criteria to determine the answer.

The reason why it is helpful to do this is that situations are like fingerprints: no two are exactly identical. Thus, your requirements for the best tool for the job at hand will tend to vary. Because of this, the best solution today at work may not be the best solution tomorrow at school, or the next day at home. How do we then determine what is the best fit for a given situation?

If you ask yourself a few questions, you will almost always arrive at an answer that will fit just right. These questions are as follows:

Is the potential solution designed to do what you are trying to do with it? I used to be an expert with Macromedia Director, a multimedia application similar to Flash. I made it do things you couldn’t believe. Because I knew it so well, I could make it do things it was not designed to do. Quite often, I had a tendency to want to use it as the development platform for projects that it was not a good fit for. Be careful with this one. Don’t use a technology just because you are comfortable with it. If it wasn’t designed for what you are trying to use it for, it is not the best solution.

Are you competent as a user or administrator of the technology? Put bluntly, if you don’t know what you are doing, you will almost certainly not be able to tap the real power of a given solution, or perhaps even accomplish the task at all. Knowledge is power, as we all know. You may also look at how long it would take you to become savvy with the technology; if the learning curve is small, that is a good thing to consider.

Is the technology resource-efficient? For example, what are the resource requirements involved with implementing this solution? What are the cost requirements? How long will it take to get the solution implemented? What are the hardware requirements of the technology? If you know nothing about the technology, how long will it take to learn everything about it? If something costs many thousands of dollars more than alternative solutions, you may not want to consider it. If it has excessive hardware requirements, this will take money out of your pocket. You will be spending many more thousands of dollars on the latest hardware when you could be using older systems with more resource-efficient solutions. If you spend a year learning how to administrate the technology, that is a huge loss of time and money.

Another question I usually factor in when considering a solution for a given problem is: What is the industry-wide user base of this solution? I do this because in my experience, the number of installations of a given solution is directly proportional to the amount of online help I can get for it, should I get stuck on a problem.

Many people seem to pick a solution that fits the first three questions above, but have no regard for the last question. They have a tendency to choose something that works the best for them, personally. This is not always the solution that is the most widely-used in the industry. Depending on the situation, this can be great, or it can cause heartache later on.

For example, let’s say I’m looking at Linux distributions, and I pick one that is not used much in the industry and is not in high demand in enterprise settings. When I apply for a Linux administration job, even though I know Linux, mine may not be the distribution that company is using. Thus, I have kind of shot myself in the foot.

Thus, for the last question, you have to kind of weigh whether you want to go with something more mainstream that more people are using or whether you want to go with something that you totally love, that you don’t care who else is using. If you do the former, there may be more demand for it in the job market, and there may be more online support available. If you do the latter, you will have the exact solution that you want and really like, but you better know a lot about it because you are going to be on your own for support.

For me, SUSE Linux answers the first three questions with flying colors. It is super versatile, and so far, has been able to handle whatever I have thrown at it. I am comfortable as an admin with it, and am only learning more with each passing day. As far as resource-efficient, it is free, takes 30 minutes to install, and can run on the oldest of machines.

It’s the last question to which SUSE answers both “yes” and “no”. It is not mainstream in the sense that Windows is the mainstream desktop. That said, SUSE is an incredibly mainstream distribution of Linux.

Evidently, the CIOs of the industry are finding that Linux does much more than cut costs, as well. Spread the word, baby.

August 15, 2006

Software RAID-5 in SUSE Linux 10.1

by @ 6:52 am. Filed under General SUSE, SUSE Tips & Tricks

This evening, I took a few minutes to appreciate how cool SUSE Linux 10.1 is. I did a fresh text-based install on a box with three 40 Gig HDDs, creating a software RAID-5 system. I wanted to see how hard it was. With no X11 install, just purely text-based, it took me about 28 minutes total. Granted, it was a very minimalistic install, consisting of about 450 Meg, and required only Disc 1. If you need a quick server, all you need is that first disc, baby.

Here’s the quick run-down of RAID-5. It’s a way to make sure your data doesn’t get lost: it provides redundancy. You need at least 3 partitions of the same size. I had three 40 Gig partitions. Take the number of partitions and subtract 1; that leaves me with 2. Then, multiply that number by the size of the partitions; that gives 80. The final size of my RAID is 80 Gigabytes. If one of those babies goes down, you don’t lose any data. You just throw another 40 Gig partition in there, and it rebuilds your data. Very sweet. I could tell you more about it, but I got most of my information from here.
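The capacity rule is easy to sanity-check with a little shell arithmetic:

```shell
# RAID-5 usable capacity: (number of disks - 1) * disk size.
# One disk's worth of space is consumed by parity, which is
# striped across all of the disks.
disks=3
size_gb=40
capacity=$(( (disks - 1) * size_gb ))
echo "${capacity} GB usable"    # three 40 Gig partitions -> 80 GB
```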

I thought it would be cool to try out software RAID-5. I was going to use it as an RSYNC backup server. If it ends up being really stable, I might slap it in as my main web server. It’s about 48 times the machine that my current one is, anyway.

Anyone have experience with this software RAID setup?

August 13, 2006

SUSE Linux is spreading on the desktop as Red Hat spins up for desktop competition

by @ 10:44 am. Filed under General SUSE, Linux News, My Opinion, SUSE News

It’s a beautiful thing when a seasoned enterprise Linux distribution feels it needs to compete in the desktop market. This means that they see a generous amount of interest from the consumer base. You see, I think they see the same thing that I see. And Novell sees. This is the fact that, although it is not perfect right now, Linux is only going to develop and mature from where it is now. More and more people are interested in using it, and that interest will only continue to grow. Notice that I did not say that this is the year that Linux overthrows the Microsoft monopoly and takes the market.

Many people use M$ OSes. However, the vast majority of them are basically “follow the herd” users. They use Windows because that’s what everyone else uses, that’s what they have always used, and that’s what they know. All they need is an introduction to Linux on the desktop and to be shown around a little, and they’ll be plenty content using a Linux desktop. I’ve seen it happen. One such user has been on SUSE Linux Enterprise Desktop 10 for a little over a month now. He has found it to be quite a pleasant experience. I’m sure it is (and can be) the same with many more people.

Spreading Linux will take a little time, but will accelerate with the continued development and maturation of Linux as a desktop operating system. Linux is not going anywhere. It is just going to get better and more widely used from here. Red Hat and Novell are already making it happen. M$ has already seen its peak of dominance.

August 10, 2006

SUSE Linux 10.1 redeemed – problems caused by faulty hardware

by @ 7:01 pm. Filed under SUSE Blog News, SUSE News, Work-Related

The server story continues. Today, my manager told me to install Windows on that box that has been giving me so much trouble. After coming back in from being violently ill outside, I got to work. Immediately, the Windows 2000 disc asked me for RAID drivers. Sweet… no dice there. Windows fails again. I took that information to my manager who said to install Windows 2003 Server. After coming back in from being violently ill again, I installed Windows 2003 Server on that machine. About 4 minutes after it was finished, imagine my glee when I saw this on the monitor:

SUSE Linux 10.1 rides again


As it turns out, the machine is just a dying piece of junk. It would crash on the NIC drivers, then it would crash on the RAID drivers. Windows 2003 Server even crashed on the NTFS drivers.

My manager is giving me a different machine tomorrow so I can put SUSE 10.1 on that bad boy. Woots. Wish me luck.

I got such a kick out of that BSOD that I have added it to my Windows Error Gallery. Go check it out.

August 9, 2006

SUSE 10.1 – Ashes, Ashes, we all fall *DOWN*!

by @ 6:20 pm. Filed under General SUSE, Work-Related

Server Update

I learned that the following patch was in the 2.6.17 kernel, a version of which was released as stable on Monday:

commit 57a62fed871eb2a95f296fe6c5c250ce21b81a79
Author: Markus Lidel 
Date:   Sat Jun 10 09:54:14 2006 -0700

    [PATCH] I2O: Bugfixes to get I2O working again
    
    From: Markus Lidel 
    
    - Fixed locking of struct i2o_exec_wait in Executive-OSM
    
    - Removed LCT Notify in i2o_exec_probe() which caused freeing memory and
      accessing freed memory during first enumeration of I2O devices
    
    - Added missing locking in i2o_exec_lct_notify()
    
    - removed put_device() of I2O controller in i2o_iop_remove() which caused
      the controller structure get freed to early
    
    - Fixed size of mempool in i2o_iop_alloc()
    
    - Fixed access to freed memory in i2o_msg_get()
    
    See http://bugzilla.kernel.org/show_bug.cgi?id=6561
    
    Signed-off-by: Markus Lidel 
    Cc: 
    Signed-off-by: Andrew Morton 
    Signed-off-by: Linus Torvalds 

However, the freaking server is still crashing. I have tried using the i2o_block module or the dpt_i2o module on Kubuntu, Gentoo, SUSE 10.0, Knoppix, and SUSE 10.1. I’ve also tried using kernel versions 2.6.15 through 2.6.17 in the SUSE 10.1 install. All of these attempts have resulted in the same exact behavior: the server hangs at random. Dean contacted me (thanks, bro) with a suggestion to update the firmware and bios. As luck would have it, they were already at their latest versions.

Because I have the 2.6.17 kernel installed with the patch that was supposed to fix my problem, I’m beginning to wonder if the problem isn’t related to a hardware failure somewhere. I started a 24-hour memory test today to see if I could find a problem with the RAM. Any other suggestions for things that I could try?

By the way, for hardware diagnosis and tons of other cool tools, I recommend The Ultimate Boot CD. Anyone else have other suggestions?

August 8, 2006

SUSE Linux 10.1 on my server

by @ 6:48 am. Filed under General SUSE, SUSE Blog News, Work-Related

I wanted to thank everyone who has provided great feedback on the ebook that I released last Monday. As many of you know, the influx of HTTP requests brought my server to its knees. This happened because my server has limited bandwidth: it could not fill all the requests fast enough, so everything bogged down. When I limited the number of connections, everything normalized again. It’s all good, though. It’s good to know that there is that much interest. Hopefully, the ebook is helpful to everyone who wants to learn how to use Linux.

I have been a bit silent since the release of the ebook. This is mainly because I am focusing on a problem I’m having with a server at work. It has an old Adaptec 2100S RAID controller. This is driven by either the i2o_block module or the dpt_i2o module. Evidently, in SUSE 10.0 (which is what I tried first), both modules load, causing a race condition. In 10.1, the i2o_block module is used. The problem is that when I use this module, the server randomly locks up. I did manage to grab this error during one of those lockups:

kernel BUG at include/linux/i2o.h:1074!
invalid opcode: 0000 [#1]
SMP
last sysfs file: /firmware/edd/int13_dev81/extensions
Modules linked in: ipv6 af_packet edd reiserfs loop dm_mod usbhid ide_cd cdrom i2c_piix4 i2c_core e1000 mii sworks_agp agpgart shpchp pci_hotplug ohci_hcd usbcore parport ext3 jbd processor i2o_block i2o_core serverworks ide_disk ide_core
CPU:	0
EIP:	0060:[]	Not tainted VLI
EFLAGS: 00210282	(2.6.16.13-4-smp #1)
EIP is at i2o_driver_dispatch+0x25/0x1a1 [i2o_core]
eax: 01ba0000 ebx: fffffffe ecx: dfcfec00 edx: 01b90000
esi: dfcfec00 edi: fffffffe ebp: 0000000b esp: c034bf38
ds: 007b es: 007b ss: 0068
Process swapper (pid: 0, threadinfo=c034a000 task=c02ef2c0)
Stack: <0>dfcfec00 00000068 c01277b2 fffffffe dfcfec00 000000b f884d62c
	c1b78840 00000000 c013ff8a c034bfa4 00000580 c0341380 0000000b c1b78840

Call Trace:
 [] do_timer+0x39/0x316
 [] i2o_pci_interrupt+0x22/0x3e [i2o_core]
 [] handle_IRQ_event+0x23/0x4c
 [] __do_IRQ+0x7e/0xd1
 [] do_IRQ+0x46/0x53
 [] common_interrupt+0x1a/0x20
 [] default_idle+0x0/0x55
 [] default_idle+0x2c/0x55
 [] cpu_idle+0x8e/0xa7
 [] start_kernel+0x2b5/0x2bb
Code: 20 75 de 5b 5e c3 55 57 89 d7 56 53 83 ec 0c 89 04 24 8b 90 a4 00 00 00 39 d7 72 0f j8b 0c 24 89 d0 03 81 a8 00 00 00 39 c7 72 08 <0f> 0b 32 04 45 ed 84 f8 8b 04 24 89 fe 29 d6 03 b0 a0 00 00 00
<0>Kernel panic - not syncing: Fatal exception in interrupt

What is funny is that the Kubuntu CD I booted into uses the dpt_i2o module rather than the i2o_block module. Because of this, I decided to force SUSE to use the dpt_i2o module as well. Hopefully it won’t lock up as it’s compiling that kernel or making and installing the modules. If anyone has any other ideas on how to address this issue, I’m all ears.
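If anyone wants to try the same thing, one way to keep i2o_block from auto-loading so that dpt_i2o can claim the controller is a module blacklist entry (the file path here is what SUSE 10.x uses; other distros vary):

```ini
# /etc/modprobe.conf.local (SUSE 10.x) or /etc/modprobe.d/blacklist
# Keep i2o_block from auto-loading so dpt_i2o gets the controller:
blacklist i2o_block
```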
