Thursday, December 25, 2008

Git it done, Rubyists

3 comments
The small article I did for Ruby Advent 2008 went up on the 9th of December. It was aptly named Git it done, Rubyists. I went through the basic steps of creating a Git repository and how to play with it using our beloved programming language, Ruby. Two libraries (Grit and Ruby/Git) were given a brief introduction, and that's almost it. The article was meant to be short and introductory. I hope someone will find it useful.
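
For a taste of what the article covers, here's a minimal sketch of reading a repository with Grit. It's based on Grit's documented API rather than lifted from the article, and the repository path is just a placeholder.

require 'rubygems'
require 'grit'

# Open an existing repository (replace the path with your own)
repo = Grit::Repo.new("/path/to/your/repo")

# Print the ten most recent commits on master
repo.commits('master', 10).each do |commit|
  puts "#{commit.id[0, 8]}  #{commit.authored_date}  #{commit.message.split("\n").first}"
end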

Ruby Advent 2008 is an advent calendar with a Ruby flavour. Lakshan, a fellow Sri Lankan Rubyist, organized this year's Ruby Advent, following the tradition from 2006. I can't judge my own article, but I can assure you that the rest of the articles are excellent.

Ruby Advent 2008 was featured on RubyInside, the RailsEnvy podcast, and many other sites including RubyFlow and RubyFu. IMHO, this year's calendar was a success. A bunch of awesome Ruby community members contributed content covering many interesting topics. A must-read for anyone interested in Ruby.

For those of you who have no idea what an advent calendar is, here's what Wikipedia says: "An Advent calendar is a special calendar which is used to count or celebrate the days of Advent in anticipation of Christmas". Lakshan and other Ruby developers around the world shared a wealth of information, one article per day, in good Christmas spirit. That's another reason why I did what I could to give a hand, even though my religious beliefs are different.

For anyone interested in Ruby programming, I'd highly recommend the advent series. So head to Ruby Advent 2008 now.

Monday, November 17, 2008

Open Source, FOSS Politics, GitHub and rise of a new era

Leave a Comment
This post is a response to a blog post by Chintana Wilamuna. Actually, it's more of an addendum than an answer; I do agree with him. This is something I want to add. It got longer than I thought, but it could make good (enough) reading if you ignore the typos. Here we go.

There is no denying that the Open Source movement has had a great impact on the ICT industry and, by proxy, on human society. If you don't believe me, clearly you haven't been paying any attention to ICT, or to mass media for that matter. Do your homework, and if you still disagree you can contact me if you want. Anyway, my point is that the Open Source movement has been a success. It has, in my opinion, induced and/or inspired other phenomena such as Wikipedia and the rise of social networking (Eg: Twitter, Facebook, MySpace, etc.).

An Open Source project is usually a software project which is community driven and centred on the software itself. The community's focus is usually on the software it's developing. This common goal, combined with a loose set of standards, lets contributors bridge gaps that would otherwise divide them and build barriers. This is how humankind created the web server which powers most of the Internet. This is how they created an operating system which rivals, and in many cases outperforms, proprietary systems built at the expense of great amounts of money, propaganda, organizing, quality control, etc. This wonderful concept of collaboration, freedom and development today powers many things, from the tiniest embedded computers and mobile phones to the most powerful computers ever built. They power everything from handhelds and game consoles to massive data centers. All by harnessing the power of people. Believe it or not, it's a fact.

However, the Open Source movement isn't the realization of a Utopian dream. It's just a way of getting things done by collaboration. We, by our nature, have a habit of bringing a certain notion of order and politics into any group or social activity. I'm not going to say whether this is good or bad, but I am going to say that the Open Source movement has not escaped it. So there is Open Source politics (Free and Open Source Software politics, that is, to be politically correct ;) ).

Traditional political methods have usually, and largely, worked. Some famous, successful software projects have a known structure and way of doing things (Eg: the Apache Foundation, the Mozilla Foundation), and it works. Among the many techniques used, we can see some dominant models: for example, governance of the project and governance of the code base specifically. As some of you may know, in a traditional Open Source project not everyone can be a "committer".

Commit access, in the traditional approach, is considered a special privilege, usually granted in recognition of skills, talent and, more importantly, contribution to the project. It has worked well for many years. This approach has the upside of low noise and better quality and control of contributions, among others. But it also has the downside of not being able to utilize the whole potential contributor base and, among other things, a tendency to annoy or offend contributors and users. The psychological image of an elite team of contributors worked, but it also deterred some other possibilities. If someone did not like the way the project was heading, she could grab a copy of the source code (remember, Software Freedom) and start developing it to fit her will. This process, known as forking, was sometimes viewed as hostile and was thought to cause division in the community. So some good ideas inevitably had to fall through the cracks.

So the million dollar question: how do we develop Open Source software, getting contributions from a wider crowd and enabling a free flow of ideas, while still retaining the stability of the project?

The software used in the traditional approach both enabled it and enforced it; their relationship was somewhat symbiotic. Certain core components, especially version control systems (CVS, Subversion, etc.), could not cater to our requirements here. As good as they are (and they are good), their philosophy and design make them better suited to the traditional, centralised approach.

Enter Distributed Version Control Systems (DVCS). Distributed VCSs are nothing new; they have been around, watching the centralised systems remain the de-facto standard in Open Source project management. DVCSs enable a more collaborative kind of development. It can be argued that this is not essentially a feature of a DVCS and is possible in a centralised system too. I agree, kind of. Even so, let's say that the software available so far does not support that argument.

Of the many distributed VCSs around, some have had more success than the rest; namely, Git, Mercurial, Bazaar and Monotone seem to be more popular than the others. Git and Mercurial seem to have the most high-profile users among the lot. Both projects share similar goals and are of roughly the same age (Git predates Mercurial by a few weeks). I've blogged about Git before, have been using it for more than a year, and am very fond of it.

If you have read this far, you might be thinking that I haven't done justice to the last part of the title of this post, the GitHub thing and the rise of something. This is that part.

Stealing part of a comment by David Heinemeier Hansson (DHH, the creator of the Ruby on Rails web application framework), I believe that a Killer App can make a breakthrough for a platform. GitHub is the Killer App of Git these days. GitHub blasted into the source code hosting scene earlier this year, a scene dominated by the likes of SourceForge.net and commercial offerings. It created tremendous buzz even before it was launched.

GitHub is a site providing both paid and free Git hosting. So what's Killer or revolutionary about it?

What sets GitHub apart and makes it a trendsetter rather than a follower is the approach they've taken. They go back to the basics of the Open Source concepts in a cool, Web 2.0-esque way, so to speak. GitHub's focus is on individuals (the building blocks of an Open Source community) and their interaction and collaboration. Being powered by Git, they happily drop the dogma related to the concept of forking while passionately crying "Fork You"!

Technically speaking, any Git working copy (in contrast to a Subversion/CVS checkout) is a full repository, which in turn makes it a separate fork. So GitHub connects people who work on different projects, lets anyone create her own fork and lets her do whatever she desires with it. If she later feels she has something to share, she can ask anyone else (including the original author) to fetch her modifications and try them out. If they like the changes she made, they can incorporate them into their own repository. The beauty of this (inherited from the DVCS) is that one can manage a project in a truly distributed way or in a centralised way. Whichever you choose, you can still use the social networking approach to get the most out of it, and avoid the political barrier to contribution.
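
To make that concrete, here is a rough sketch of the kind of exchange I mean, using plain Git commands (the usernames and URLs are made up for illustration):

# Alice forks and hacks on her own full copy of the repository
$ git clone git://github.com/original-author/project.git
$ cd project
... hack, hack ...
$ git commit -a -m "My neat hack"

# Later, the original author pulls Alice's work in, if he likes it
$ git remote add alice git://github.com/alice/project.git
$ git fetch alice
$ git merge alice/master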

This is not to say that the GitHub approach eliminates project politics. It does not, but it prevents politics from getting in the way of people who want to contribute. This, I think, is the dawn of a new era in Open Source software development. It signifies preservation of the community while giving everyone more value and respect, and appealing to a broader audience. We can already feel the waves rippling through, making other Open Source projects try this out. More than anything, this enables coders to move freely, doing that neat hack on the random project they stumbled upon, without having to worry much about project politics. I know they would agree: it means a lot.


Clarification and a disclaimer: I'm openly a Git/GitHub fan/user, and I chose Git after using it and learning it, before the Ruby on Rails community went crazy about Git. So I may be biased, but don't discount anything about Git, GitHub, Ruby, Rails or the Open Source movement as a whole because of that. Read and make up your own mind.

I haven't discussed any downsides of the new approach, which I'm sure will be analysed and told in great detail in the many years to come.

This post took a surprisingly short time for a long post, so you are bound to run into typos and other language abuses. If you read this far, don't forget to leave a comment. I think I'll find your ideas interesting. ;)

Tuesday, August 26, 2008

Howto Setup a Subversion (svn) Repository for a Rails Project + Bonus

6 comments
Setting up a Subversion (svn) repository is something development teams have to do fairly regularly, not that I want to use Subversion. :) If you listen to me, go use Git. Subversion is undoubtedly very good, but after using Git for about a year, you simply can't get me to switch back. Git is that good. :) I've written about Git before.

In cases where you can't use Git (or you feel too castrated by TortoiseSVN, pardon the pun) you can use Subversion. In this post I'll go through the steps you have to follow to get a basic Subversion setup up and running on a CentOS 5 Linux host. However, you should be able to use this on other Linux distros too.

There's more than one way to host a Subversion repo. I'm going to stick with one way involving WebDAV. Don't mind the buzzword; it's the most common setup for this purpose. If you want a repository you can check out from and commit to remotely, this is an easy way of getting it done. In addition to that, I'll have some information targeting Ruby on Rails projects. However, most of it is not Rails specific and is useful with other things too.

In addition, I'll mention a problem I came across where acts_as_ferret caused an error, and what the reason was.


1. Selection of Tools

We are going to use the following tools:
  • Apache HTTPD (Web Server) 2.2
  • mod_dav_svn for Apache
  • Subversion
If these are not installed already, go ahead and install them. You can use any method you like. This is how you do it using the yum tool:

# yum install mod_dav_svn subversion httpd

You can verify whether you have them installed via the RPM system by using

# rpm -qa subversion httpd mod_dav_svn

If installed it'll show something similar to
subversion-1.4.2-2.el5
mod_dav_svn-1.4.2-2.el5
httpd-2.2.3-11.el5_1.centos.3
Note: If you didn't use yum/rpm to install, most probably you won't be able to use the above rpm command.

Now that you are set, let's move forward.


2. Create the Subversion Configuration File for Apache

For this, create a file under /etc/httpd/conf.d/ with a name ending in .conf and enter the following lines
(Eg: /etc/httpd/conf.d/subversion.conf)

LoadModule dav_svn_module modules/mod_dav_svn.so
LoadModule authz_svn_module modules/mod_authz_svn.so

<Location /repos>
DAV svn
SVNPath /var/www/svn/repos
AuthzSVNAccessFile /etc/svn-acl-conf
AuthType Basic
AuthName "UberCool Subversion Repository"
AuthUserFile /etc/svn-auth-conf
Require valid-user
</Location>


Replace the values for the following according to your requirements:

Location - The location you want in the URL (in this case: http://example.com/repos)
SVNPath - The path to the directory where you wish to store your repositories
AuthzSVNAccessFile - The file where you hold the access control details for users
AuthName - A description you'd like the users to see
AuthUserFile - The file where you hold the usernames/passwords for your users

Note: In CentOS 5, files with a *.conf name under /etc/httpd/conf.d/ will automatically get loaded. If you want to change this behavior or add another location for configuration files, the magic line is in the /etc/httpd/conf/httpd.conf file:
...
Include conf.d/*.conf
...


3. Create Subversion Users and Set Access Control

Here I'll show how to manage Subversion users and their (basic) access levels.

3.1 Manage Subversion Users

Let's now add users to our little, still non-existent repo. This step does not have to precede the repo creation.

What we are going to use is basic HTTP authentication. If you want to use different authentication systems like LDAP you'll have to look elsewhere for that.

Use the htpasswd command to create a password file and then to add/modify/remove users.

Create a Password file:
# htpasswd -cm /etc/svn-auth-conf yourfirstuser

Add/Modify Users:
# htpasswd -m /etc/svn-auth-conf yourseconduser
# htpasswd -m /etc/svn-auth-conf yourotheruser

Remove User:
# htpasswd -D /etc/svn-auth-conf yourotheruser

3.2 Set Access Control

What if you want certain users to have full access to the repository, but only read-only access for a few others? You can achieve this level of access control easily. This is why we mentioned a file called /etc/svn-acl-conf in the Apache configuration.

Edit the file with your favorite editor and enter the details. I'll put in some sample data; replace these with your actual users. Please note that "r" stands for read and "rw" for read-write.

[/]
yourfirstuser = rw
yourseconduser = rw
yourthirduser = r

Save the file and restart the Web Server (Eg: # service httpd restart)


4. Create Repository Storage Area

It's not much, nothing too fancy. :) Create the directory you specified in the Apache configuration file, create the repo and change the permissions so that Apache can read it. Here we go,

# mkdir -p /var/www/svn
# cd /var/www/svn

# svnadmin create repos
# chown -R apache.apache repos

If things have gone fine this far, you'll now be able to see an empty repo. Just start/restart the web server (# service httpd restart) and browse to the relevant URL (http://example.com/repos). If all is good you'll see a page saying Revision 0.


5. Set to Import Your Rails Project to Subversion Repository


Note: In this post I'll go with creating a proper Subversion repo with a branches/tags/trunk layout. If you want to host more than one project in a repository, here is a good place to start (which, by the way, is the source of much of the information in this post; I'm just compiling it with my own experience). The above link will show you how to do a simple import of an existing project. However, I'm travelling the longer path because it gives us more flexibility in setting repository parameters, as you will see in step 6.

I'll assume you already have a Rails project. Since Rails projects have certain peculiar requirements, it'll be a great way to experiment with Subversion. Let's assume your project is in your /bak directory and is called UberCool (directory name: ubercool).

First let's create a proper Subversion top-level layout with
# svn mkdir --message="Setting project layout" file:///var/www/svn/repos/trunk file:///var/www/svn/repos/tags file:///var/www/svn/repos/branches

Then, check out our repository into the project directory with
# svn checkout file:///var/www/svn/repos/trunk /bak/ubercool

Note: We are only set up to import here. We haven't added or committed any files yet. So if you are following this post (*not* the above mentioned guide from the CentOS wiki), please proceed to step 6. We are not done yet.


6. Tweak Your Subversion Repository

Although we got set up to import the project in step 5, we haven't actually imported it yet. What we want to do is apply the special tweaks we need in the repository before we import the project. So here we go.

Change into the project directory. Then, add the files to the Subversion repo. This is not the same as committing; we are marking the files to be imported (or technically, committed) from the project directory into the repo.
# cd /bak/ubercool
# svn add . --force

The --force option is required because Subversion does not yet know that these files are part of the project. So we have to instruct it sternly to add the files. :)

6.1 Stop unnecessary files from being Version Controlled

As you might know, there are files that we do not wish to be version controlled. In a Rails project you most definitely don't want to keep track of the history of log files. In most cases you don't want to version track images (and/or binary files such as JPEG, MPEG, FLV, etc.). You can remove these files from version control. For example, here are a few things I do in a typical Rails project:

# svn revert log/*
# svn revert config/database.yml
# svn propset svn:ignore "*.log" log
# svn propset svn:ignore "database.yml" config
# svn propset svn:ignore "database.yml" config
# svn revert public/index.html
# rm public/index.html

The above will remove the log files and database.yml from version control and will ignore future versions of those files. It will remove the public/index.html file from version control too.

Rails projects also usually require a tmp directory, which is not available here. So let's create it but keep its contents away from version control. The tmp directory is usually required; if you've used Mongrel, you know what I mean. :) Whatever other directories or files you need to be available, yet not version controlled, create them now and remove them from version control (Eg: database migration files):

# svn mkdir tmp db/migrate
# svn propset svn:ignore "*" tmp

6.2 Mark Executable Files

There are files that you'd want to keep as executable files. If you are not sure why, let's consider a situation I've been in. Skip the background story if you don't want to hear it.

[Start of Side Story]
At my last workplace we had a project which refused to obey when we called $ rake db:fixtures:load. We knew this error was introduced only after acts_as_ferret was added to the application. We were able to confirm it when we noticed that fixtures without a Ferret index could be loaded successfully.

After some work we traced the cause to ferret_server. In fact, the ferret_server script failed to run. I know some of you might not have known that such a server is associated with acts_as_ferret, but there is one, and the script must run for acts_as_ferret to function.

So what was the issue? It was the common Unix execute-permission issue: simply, script/ferret_server did not have execute permission on the deployment. The starting point was the Ferret plugin being committed from a Windows box, where there is no notion of execute permissions. But even committing from a Linux box would not have made any difference for us: our Subversion repository did not carry the execute bit either.

Ultimately I had to solve the issue by modifying our deployment (Capistrano) scripts. Ideally I should have made the Subversion repo aware of execute permissions, but I didn't have that luxury because our particular Subversion hosting was provided by a third party. However, you can have that convenience and mark the necessary files as executable in your repo. Read ahead.
[End of Side Story]


So let's make sure that the contents of the script directory (and other required files) of our Rails project are marked as executable. Please note that this works only on files; you cannot set the executable property on directories in Subversion.

# svn propset svn:executable ON script/performance/* script/process/* script/* public/dispatch.*

Or, a better Unix-y way:

# svn propset svn:executable ON `find ./script -type f | grep -v '.svn'` public/dispatch.*


This was all I felt was necessary, but browsing the web I found one more important tweak. Since all deployment hosts don't handle line endings in the same way (Eg: Unix & Windows), it's best to allow the OS to choose the line endings of the dispatchers.

# svn propset svn:eol-style native public/dispatch.*

With the above, we just said that Unix users will get LF line endings while Windows users will get CRLF endings. Don't worry about it much if you don't know what I mean about line endings. :)


7. Commit Your Hard Work and Sigh

Now you have completed all the hard labor of creating and tweaking a Subversion setup. Let's commit the work we've been doing now.

# svn commit --message="Initial commit, worked hard for this,.. really"


There you go. I know that's some work, but it's not too hard. After all, we admins are paid to do this work. :)

Now you can let out a sigh of relief, enjoy your shiny new Subversion repository and work on your Uber Cool project. You can check out your new repo using any Subversion (SVN) client with the URL http://example.com/repos/trunk

Eg: $ svn co http://example.com/repos/trunk ubercool



That's all for now. This (rather long) post was written in haste, so I beg your pardon for any mistakes or poor quality you encounter. If you need more information about Subversion administration, don't forget to check the Subversion Book. I've referred to it all the time whenever I'm not allowed to use Git. :P


PS: If you do not believe that Git is so good,............. go fork yourself!

Friday, July 11, 2008

Updated: A Simple Diagram on Distributed VCS (Hint: Git)

Leave a Comment
This post is an update of a post I made on 29 May. So if you like you can skip the whole post and just download the diagram (PDF/PNG). PDF version looks better.

This is a simple diagram to illustrate the use of, and the difference between, a Distributed Version/Revision Control System (DVCS) as opposed to a traditional/centralized VCS. The post targets a general audience and will not include detailed technical information; it is rather an introduction to DVCS in general. As you already know, Git is my favorite VCS, and the diagram also has some references in that sense.

My purpose in drawing the initial diagram was to explain DVCS (namely Git) to a client. However, I wanted to change a few things, and here are the results. I didn't think it was important enough to create a new version, but I wanted to try out OpenOffice.org Draw as a diagramming tool. The older version was done in Dia. After trying OO.o Draw, I think I'm going to stick with it for most of my diagramming needs. :) Still, Dia is another great tool to use. So here is the document which came out of my OO.o Draw experiment.

If you are interested in learning more about DVCS, here are a few links to get started.


Without further delay here is the diagram. You may want to download the (PDF/PNG) file for better viewing.


Here is the original post for your convenience:

-- Start of the Original Post --

I'm busy these days, everybody. Lots of Rails applications to be deployed on Linux hosts with Thin and also Passenger. :) I know, I know, I'm delaying the Ruby hosting post again.

Anyway, I don't have much time right now for a full post. Meanwhile, enjoy this diagram about distributed version control systems, especially Git. This is intended as an answer for people who are new to distributed VCS and are trying to understand why everyone is leaving Subversion to jump on the Git bandwagon. :) If you need a wee bit more info about Git from me, have a look at my old blog post here.

So here it is. Enjoy it while I get back. And.. oh, you might want to view the image at actual size to read its content. :)

Image Link: Diagram

Update: Ok, this is the second time Scott Chacon beat me to doing something and did it way better than I could have. Why fret? The guy's awesome. :) If you want a better view of how Git works, I'll beg you to watch Scott's Git video or at least Randal Schwartz's talk at Google Tech Talks.

-- End of the Original Post --

Saturday, May 03, 2008

Errno::EPIPE (Broken pipe) MySQL Error in Rails

4 comments
I've been working professionally with Ruby on Rails for a few months, mostly in a SysAdmin capacity to be exact. During that time I've seen some weird errors which I had not seen anywhere else. Time rolled on and now those things don't look weird at all. Actually, I should have looked more carefully. Later I did, and found my way through. So here's some stuff I found. Hope this will save someone some time.

(My Ruby servers post is coming shortly,.. really, and will include details about Thin and Passenger too. Actually, I was waiting for Phusion Passenger AKA mod_rails to be released. For a quick peek at the post: I'm currently running Thin in production and also evaluating Passenger.

Update: I've moved about 6 apps to Passenger. So far so good. Thin is still my first choice though. You can look forward to the post, along with some Capistrano scripts too, ...soon. :) )


1. Errno::EPIPE (Broken pipe - The Major Pain in the Neck)

The team I'm working with uses MySQL extensively. I'm glad they opted for an Open Source DBMS rather than doing what a lot of Sri Lankan IT firms do (i.e. running the Unity Plaza edition of MS SQL Server). Since Sun is considering closed-sourcing parts of MySQL, I'll be on their nerves pushing them toward PostgreSQL. I've always preferred Postgres (which, counting from its roots, turns 20 years old next year), and even Sun says that PostgreSQL is the most advanced Open Source DBMS. Now that I've mentioned it, expect the removal of that web page soon. :) Back to the topic.

We had a testing server which crashed overnight. It was a Linux (CentOS 5) installation, so I was pretty sure it wasn't the OS. It worked as usual when we started the web app and kept working fine. But when we returned the next morning, the web app was not working, displaying an "Application Error" page.

Day one: I ignored it. Don't blame me; I had other servers to manage. Being the only Linux admin might be a privilege, but not always. After all, the server was running an application going through heavy development. In fact, it had several known application errors. I might not even have read the logs.

Day two, day three, day four,....... Ok, there's something wrong. So I started digging through app logs. There it was, a broken pipe (literally).

The error read as Errno::EPIPE (Broken pipe). Quick Googling showed me that it was something reported before, even in the pre-Mongrel era of Rails. At this point I was moving to Thin as my preferred Rails backend, so I mailed the friendly Thin Google Group. I will not go into the detailed discussion here; anyone interested can see it at the above link.

So this is the problem. I had help figuring out what it actually was.
  • The error was occurring due to 'something' in the MySQL driver
  • The actual error was the termination of the app's DB connection, due to inactivity

After discussing with the Thin group and checking a lot of web pages, these were the only suggestions which seemed like solutions. Which means I'm going to omit the parts where it was advised to paint your face with salamander blood on a full-moon night and dance around a parking lot.

  • Set ActiveRecord::Base.verification_timeout = 14400 in config/environment.rb (or any value that is lower than the MySQL server's interactive_timeout setting). Or,
  • Create a sleeper thread which would use the DB connection periodically
Eg:
# Keep the DB connection alive by touching it every 30 minutes
Thread.new do
  loop do
    sleep(30 * 60)
    logger.fatal("ActiveRecord::Base.verify_active_connections!")
    ActiveRecord::Base.verify_active_connections!
    ActiveRecord::Base.connection.select_value('select 1')
  end
end


I tried both. Sometimes they seemed to succeed, but the crashing was not completely eradicated. I was getting really frustrated. There seemed to be no other pragmatic solution, and people were starting to doubt whether Rails was enterprise ready. I desperately had to do something, so I went back to basics and started working up. This is when I remembered that there are two MySQL drivers for Ruby.
This is the point at which I recalled installing Ruby/MySQL (the pure Ruby driver), since a part of the application required a MySQL driver. And sure enough, the error was generated from mysql.rb. There, I had a break. So, as the next natural step, I removed Ruby/MySQL and installed MySQL/Ruby. Although both drivers are maintained by the same person, I had hope for a fix in the C driver.

Removing Ruby/MySQL proved to be as simple as deleting mysql.rb from the installation location. Installing MySQL/Ruby was a little tricky. The site listed a two-step build process, but the first step had 3 different alternative versions.
  • % ruby extconf.rb
  • % ruby extconf.rb --with-mysql-dir=/usr/local/mysql
  • % ruby extconf.rb --with-mysql-config

The step that worked for me was
  • % ruby extconf.rb --with-mysql-config
Then,
  • % make


At this point I could make sure it worked by running the bundled test script like:
  • % ruby ./test.rb -- [hostname [user [passwd [dbname [port [socket [flag]]]]]]]

Then finally,
  • % sudo make install

That's it. It solved my problem. It's been several weeks now and the application is running fine with the new MySQL driver. I know this is not a proper solution to the problem, but so far it has proved better than anything I found on the Internet. I hope someone else will find this useful too.



2. Proxy Errors

This second error is not related to MySQL at all, but I'll just mention it. It's more of a blunder on my side than an actual error.

The same application started giving out proxy errors. It wasn't all of a sudden. I'd seen that error before when one or more Mongrel instances in a Mongrel cluster died, so I just restarted the whole Mongrel cluster and informed the developers. This was at the peak of the annoyance of that MySQL error, so we were more concerned about that.

But the issue turned out to be a more severe pain than I'd hoped. When the developers complained about constant proxy errors, I knew I had to go back to the logs. However, without much delay I figured out where I had made the mistake. Since the MySQL issue was solved, my mind was relaxed enough to notice the stupid mistake I had made.

Earlier, my Mongrel instances were running from port 8000 to 8003. So my Apache mod_proxy/mod_proxy_balancer configuration looked like this:
BalancerMember http://127.0.0.1:8000
BalancerMember http://127.0.0.1:8001
BalancerMember http://127.0.0.1:8002
BalancerMember http://127.0.0.1:8003

But at one point during configuration and tuning, I thought it would make more sense to run the Mongrels from port 8001 to port 8004. I went ahead with that and reconfigured the Mongrel cluster so that it was running on ports 8001-8004. During that time I was testing both Nginx and Apache back and forth on the same server, so the web server configurations were being changed all the time. Eventually this left me with the mod_proxy/mod_proxy_balancer configuration shown above.

It was a funny situation. Apache was looking for ports 8000-8003 while the Mongrels were serving on ports 8001-8004. This resulted in the Mongrel instance on 8004 being unused and Apache forwarding requests to a port (8000) where nothing was running. That is why the proxy error was regular and consistent. :) Fortunately, I found this before someone else did and saved myself the ridicule.
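
The fix was simply making the two match again; in my case that meant pointing the balancer members at the ports the Mongrels were actually using:

BalancerMember http://127.0.0.1:8001
BalancerMember http://127.0.0.1:8002
BalancerMember http://127.0.0.1:8003
BalancerMember http://127.0.0.1:8004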

So the next time you get a proxy error which seems recurring and consistent, don't forget to check your backend configs (Eg: Mongrel, Thin, Ebb, etc.) against the web server proxy configuration (Eg: Apache, Nginx).

Wednesday, March 19, 2008

Arthur C. Clarke: The Odyssey Concludes

3 comments
As most of you have already heard, Sir Arthur C. Clarke, the British/Sri Lankan writer and visionary, passed away this morning at Apollo Hospital. Clarke, who was a household name in Sri Lanka, had been living in the country since 1956. It is said that the southern sea is what brought Clarke here. It is a known fact that he really liked places like Unawatuna, Roomassala, etc.

Clarke was known worldwide for famous works like the Odyssey series (2001, 2010, 2061, 3001), Rendezvous with Rama, The Fountains of Paradise, The Deep Range, etc., and also for his inspirational visionary work. For me, "The Deep Range" is the favourite, not because of the Sri Lankan connection in the book, but because of the picture it paints of the ocean and its creatures. I'll always keep re-reading it to experience that wonderful feeling of being in the ocean. This is also the book where Clarke points out that we could look into the sea before going to walk among the stars.

Sir Arthur's passing marks the end of the reign of "The Big Three" of science fiction (Isaac Asimov, Arthur C. Clarke and Robert A. Heinlein). I'll not get into the details of his contributions to the world, sci-fi, non-fiction or otherwise; for a start you can read the Wikipedia page I linked above. I just wanted to say that he and his work have been inspirational for a number of generations, and will remain so.

Here are some links I found on the Net reporting his death:
Slashdot: Arthur C. Clarke Is Dead At 90
The Associate Press: Writer Arthur C. Clarke Dies at 90
Washington Post: Arthur C. Clarke; Sci-Fi Writer Foresaw Mankind's Possibilities
BBC: Writer Arthur C Clarke dies at 90
LA Times: The passing of a legend: Arthur C. Clarke
National Post: Sci-fi giant Arthur C. Clarke dead at 90
Bloomberg: Arthur C. Clarke, Author of `2001: A Space Odyssey,' Dies at 90


With generations of people having been inspired by his work, the Clarke mark will be even more visible in the future than it was in the past. Whenever we look at the stars, whenever we unravel the mysteries of the deep sea, whenever we take steps forward as a race, and whenever a LOLCat meows about stars :) we'll remember him.

PS: A side note. When people in the US and similar time zones heard that Clarke had died on Wednesday, it was still Tuesday for them. So, as one person on Slashdot said, "He even died tomorrow". Well, he lived the future, and died in the future.

Tuesday, March 18, 2008

Google Summer of Code 2008 Mentor Organization List Announced

2 comments
Summer is going to be upon us very soon and it certainly looks awesome. As usual, Google is brightening it up. Google Summer of Code, more lovingly called GSoC or SoC, has been announced for the fourth consecutive run! (Meanwhile, let's hope life gets better for the people affected by the forces of nature in the past few days.)

Google Summer of Code is a student program where university (BSc, MSc, etc.) students can work on an Open Source software project for 3 months under Google sponsorship. I was one of the lucky students last year (but had to resign due to a couple of domestic bereavements). This year's program is announced and I hope at least a few excellent coders will be interested in this news. Please convey this news to relevant students, while I check out interesting projects for myself. :) And remember, last year about 20-odd students were selected from Sri Lanka. So don't think you don't have a chance. You just have to be enthusiastic, skilled (in coding) and able to allocate time (which I know from experience is important).

Google Summer of Code is one of the best ways to get industry-level experience, make connections with a lot of important programmers and projects around the world, and contribute to Open Source projects where thousands of people will be using your code. To top it off, Google pays each selected student a stipend of $4,500, which is over LKR 4.5 lakhs.

Google Summer of Code, running for the fourth consecutive year, is now announced and will accept student applications between 24-31 March. Any interested student can see the list of mentoring organizations and their suggested ideas at http://code.google.com/soc/2008
Actually, those ideas are just suggestions, and students are welcome to propose their own.

This year there are 175 Open Source projects for students to work with. Students will find familiar names among them like Apache, Google, Nmap, Fedora, Debian, GCC, PostgreSQL, MySQL, PHP, Git :) , Subversion, Pidgin, Adium, Python, Django, GIMP, OpenMoko, GNOME, KDE, Vim, FFmpeg, VideoLAN, Samba, Sahana, WordPress, Pentaho, OLPC, etc. Please note that this is not about developing applications using those projects, but about developing the projects themselves. GSoC selection is a very competitive process and good development skills are required.

The SoC FAQ will answer your burning questions about particulars like how to apply, eligibility criteria, etc. So please bring SoC to the attention of students with strong programming skills and encourage them to apply.

Wednesday, March 12, 2008

No! I Will NOT contribute 5 Cents to that Sick Child!

3 comments
I'm sick of chain e-mails.

I'm so sick of the emails some people send me (oh, how much I feel like calling them offensive and degrading names!) saying "Please forward this e mail if you have a heart" or "Please forward this message to everyone you know" or "Please help to save a life", perhaps containing a poem named "Slow Dance" or something. They claim they are so authentic (yeah, and I'm Superman in disguise). It might be something else like "forward this 100 time and it will bring you fortune. This is REAL! This worked for me" or "Fwd:FWD: Worst Virus Ever: Be careful !!!". The last one I got was named "Fwd: Pleeeeease forward ...it costs nothing 2 u.". Costs nothing?!!! It costs my patience, my nerves, my bandwidth, my time and sometimes my sanity too, when I notice the sender is an IT student!

I don't mind promoting a cause which I know of and can verify, like the fundraiser to support a university student who was dying of cancer. But all the other things whose validity no one can verify (which is the common case) are not going to pass through me.

To make it clear I'll just say: "Those mails are hoaxes, so stop spamming!". Give some thought to these points. Pleeeeeeeeeeeeeese! ;)


1. E-mail systems were never designed to track e-mails.

E-mail, which is explained in gory technical detail in RFCs 2822 and 2045 through 2049, was never designed with a facility to track the e-mails sent. Facilities were added later to support delivery receipts and return receipts, but those are just between servers. This means that, as far as I know, Microsoft, AOL or anyone else (unless of course they are employing Hana Gitelman) *cannot* track the e-mails you send.

So next time our genius spammer friends out there will tell you that your e-mail provider is sponsoring the project, instead of AOL. You are warned. If you are convinced that is the case, please re-read the privacy statement of your e-mail service provider and contact them if necessary.


2. There's no reliable way I know of which can track data across the Internet

The Internet is a vast number of servers interconnected by various means. I guess all of you remember that basic introduction. So when you send a mail, that piece of data travels through this forest of servers, through numerous nodes and so on. Some servers strip data from it, some add to it. So don't expect there to be a reliable technique to track e-mail. Whatever some mails claim, to my knowledge it is not effective.


3. Some of the things are known Hoax mails/chain mails

I know most of you don't care, but some of these cases are well documented on the Internet/WWW as known junk. Next time you receive such a mail, do a little background search and see for yourself.

Next time you feel like forwarding a mail about Amy Bruce, Craig Sheldon or whoever it is, stop! Most of the people who forward these do not even think of checking the claims a little. Most organizations have warning pages about chain mails. For example, Make-A-Wish has a page saying that the Amy (Amrita, etc., etc.) Bruce chain mail is a known hoax.

4. You are helping Spammers!

I once got a chain mail which had over 2000 e-mail addresses in it. Yes! Over 2000. Some were wide open to read, some were hidden in the headers. Should a spammer get his/her hands on that mail (which I'm pretty sure will happen), he/she will have another 2000+ e-mail addresses for their database. Just imagine the number of e-mail addresses passing around the Internet in plain text!

It is believed that most of these chain mails originate from professional spammers. :) And I've seen some locally originated stuff too. Whatever the cause may be, it'll still be to the spammers' advantage.


5. You are offending/harming others

By including your friends' e-mail addresses in a spam mail list, you are doing them harm. You are also affecting their productivity, time, bandwidth, etc. You expose them to phishing and other forms of scams, and these mails can also carry malware like trojans, worms, spyware, etc. If you don't listen to any other reason, at least stop because of this security concern. Need I say any more? You know the pain yourself.


6. You are harming yourself

By continuously sending chain mails, you are not only jeopardizing your friends, but also damaging your own credibility and reputation. Your e-mail address will shortly end up in blacklists and it will be very hard to earn that credibility back. For example, there are guys I completely ignore when I check mail or, better, completely filter out. Me doing this might not hurt you, but it will when Google does it.



Some of these mails tell a heart-touching story of a sick child, a sick husband or someone like that, and mention something like "AOL and Microsoft will track this e mail and contribute 5 cents per every message you forward". Or it might be like "Dialog is offering brand new free Nokia phones, forward this 100 times to qualify for a N95 draw." Sounds familiar? It should. I suspect some of you have even been forwarding such mails. If so, please give up, in the name of the real people suffering from your spam.

I know most (if not all) of the people who send me these kinds of emails have the good intention of supporting an innocent soul. But I'm sorry to tell you that there are jerks feeding upon your good human soul. So please stop, now. You are not helping anyone. No, you are not donating anything when you forward those mails. To make the point clear, no one is tracking how many times you send that e-mail. So stop hitting the forward button in vain.

If you are not ready to take my word for it, try Googling and be ashamed of continuing chain mails. For those who are too lazy to click that link, I'll list a few things you should see.

Your mails will not be tracked, will not be counted, will not help, will not bring good luck, will not win you prizes, will not do any good.

Stop spamming your friends today!

Tuesday, March 11, 2008

Howto setup a MySQL Connector/J 5.1 for Tomcat on Linux

5 comments
Again, I'm not switching to Java. :) For clarity, I'm helping one of my online buddies set up and use Ruby even as I write this. This work was something I had to do for a Rails project which used JSPs and such with a MySQL database over JDBC. The application setup was quite interesting, calling JSPs to work with a Rails web app.

Actually, the following can be found on the Internet. I cannot remember all the sources I looked at, but among them were MySQL's own documentation and the Apache Tomcat documentation. So if this works (which in my case it did), the credit should not be mine. :)

Here's the setup.
  • GNU/Linux (in my case CentOS 5, although should work with any Linux distro)
  • Apache Tomcat (5.5.25, should work with Tomcat 5.5 range)
  • Sun JDK (1.6.0_04)
  • MySQL (5.0.22)
  • MySQL Connector/J (5.1)

1. I assume that Java is set up (see my previous post for more details on setting up Java manually), and that your MySQL is running on the same host on port 3306. Please substitute your actual settings if they are different.

2. First, let's set up Apache Tomcat 5.5.
If you already have Tomcat up and running, feel free to skip to step 3.
You can download it from http://tomcat.apache.org/. In my case I downloaded Apache Tomcat version 5.5.25 (Eg: apache-tomcat-5.5.25.tar.gz).

Extract the downloaded archive to get Tomcat (Eg: tar xzvf apache-tomcat-5.5.25.tar.gz). This will give you a directory with a name similar to apache-tomcat-5.5.25.

Move this directory to a place where you'd run it as the Tomcat server.
Eg: cp -R ~/apache-tomcat-5.5.25 /var/tomcat

Now set the variables CATALINA_HOME, JAVA_HOME and JDK_HOME. One way of doing this is by adding them to the /etc/profile file (Eg: sudo vi /etc/profile).
Eg: Add these lines to the end of /etc/profile:
CATALINA_HOME=/var/tomcat
JAVA_HOME=/usr/java/jdk1.6.0_04
JDK_HOME=/usr/java/jdk1.6.0_04
export CATALINA_HOME JAVA_HOME JDK_HOME

To make sure it takes effect, you can log out and log in again, or just close the terminal and start a new one.

Then you can start the Tomcat server by running the startup.sh script which comes with Tomcat. The shutdown script is called shutdown.sh. Both are in the bin directory of your Tomcat directory.
Eg: $/var/tomcat/bin/startup.sh


3. Now let's move to the Connector/J setup. Download MySQL Connector/J from http://www.mysql.com/products/connector/j/. In my case I downloaded version 5.1. You'll get a .tar.gz file
Eg: mysql-connector-java-5.1.5.tar.gz

This writeup assumes this connector version. It is the latest as of this writing, but if the version differs, the following configuration instruction may not work.

4. Extract the connector .tar.gz archive to get a .jar file, which is the actual connector.
Eg: tar xzvf mysql-connector-java-5.1.5.tar.gz

This will most probably create a directory with a name something like mysql-connector-java-5.1.5. In that you'll find a directory structure where the .jar file (Eg: mysql-connector-java-5.1.5-bin.jar) will be in the topmost level.

5. Copy the .jar file (Eg: mysql-connector-java-5.1.5-bin.jar) to your $CATALINA_HOME/common/lib (i.e. the common/lib directory within your Tomcat directory).
Eg: sudo cp ~/mysql-connector-java-5.1.5/mysql-connector-java-5.1.5-bin.jar /var/tomcat/common/lib/

6. Create a context configuration file for Tomcat.
Create a configuration file in your $CATALINA_HOME/conf/ with the file name apps-yourapp.xml.
Eg: If your application's name is myapp then,
$ vi /var/tomcat/conf/apps-myapp.xml

7. Enter the contents found here or in this file. When you copy and paste that code, please remember to replace the values for Context path, docBase, Resource name, ResourceParams name, username, password and url with your own values.

Eg: If your application is located under /var/tomcat/webapps/myapp and it's accessible via http://yourdomain.tld/myapp, then your Context path would be "/myapp" (where it's accessible in the URL) and docBase would be "webapps/myapp" (where it's available on the file system).
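
Since the linked example isn't reproduced above, here is a rough sketch of what such a context file can look like, using Tomcat 5.5's Resource-attribute style (the resource name, credentials and pool settings are placeholders; the example the post originally linked to may have used the older ResourceParams style instead):

<Context path="/myapp" docBase="webapps/myapp" reloadable="true">
  <!-- JNDI DataSource backed by MySQL Connector/J; replace the placeholder values -->
  <Resource name="jdbc/myapp" auth="Container" type="javax.sql.DataSource"
            driverClassName="com.mysql.jdbc.Driver"
            url="jdbc:mysql://localhost:3306/my_database?autoReconnect=true"
            username="myuser" password="mypassword"
            maxActive="20" maxIdle="10" maxWait="10000"/>
</Context>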

8. Restart your Tomcat and now you are good to go. In your Java code you can get a JDBC connection with something like:
Connection conn = DriverManager.getConnection("jdbc:mysql://yourdomain.tld/my_database?" + "user=myuser&password=mypassword");

Monday, March 10, 2008

Howto Setup Sun Java on Linux Manually

3 comments
I'm not switching to Java. :) But this particular thing is something I've answered several times. So instead of repeating it every time, I'll put the details here and point others here. Installing Sun's Java (JDK and JRE) manually, without using something like yum, apt-get, urpmi, etc., seems to be something many desktop (or whatever) Linux users want. For example, they want to run certain applications like NetBeans, FrostWire, etc. which require a JRE to be available. This tutorial is going to take you through the steps you need to set up a working JRE/JDK on a Linux system. You may as well take this idea and tune it for non-Linux environments like Solaris or *BSD.

Note: This howto assumes you have 'sudo' configured for you. If you don't have 'sudo' configured, you'll have to either use 'su' or login as root to use the commands I have provided with 'sudo' at the beginning. Please replace file names and paths in this howto with your own values.

1. First download the JRE/JDK from Sun (http://java.sun.com/javase/downloads/). For the rest of the tutorial I'll use JRE 6 Update 2, but even if you want to set up the JDK the steps are the same. The selection of the package is your choice. Remember, the .bin (not -rpm.bin) file can be used on virtually any Linux distribution, provided it has fairly up-to-date components and the correct architecture (Eg: x86, x86_64, etc.).

You can get the JRE as a Linux self-extracting file (.bin) or an RPM archive (-rpm.bin) file at the moment. Even if they change these things, I think they'll still provide the archive (not the package).
Eg: jre-6u2-linux-i586.bin

2. After you complete your download, go to the directory you have it on disk.
Eg: $ cd ~/skyeye/Desktop/ if you have it on your desktop

3. Change the permissions so that you can execute the file
Eg: $ chmod a+x ./jre-6u2-linux-i586.bin

4. Execute it
Eg: $ ./jre-6u2-linux-i586.bin

5. You'll have to go through Sun's license notice and accept it to use the JRE/JDK. After this the package will extract and you will have a directory,
Eg: ./jre1.6.0_02 if you downloaded the .bin file, or you'll have a jre1.6.0_02-i586.rpm file if you downloaded the -rpm.bin file.

6. If you now have a .rpm file, you only have to use rpm (Eg: rpm -U) or a relevant utility (Eg: yum localinstall)
Eg: $ sudo rpm -Uvh jre1.6.0_02-i586.rpm

You are done. Congrats!

But if you downloaded the .bin in the first place, please proceed from step 7.

7. Move your new JRE/JDK directory to where people can access it
Eg: $ sudo mv ./jre1.6.0_02 /opt/

8. Add the path to the bin directory within your JRE/JDK directory to your system's $PATH. Use your preferred editor to edit the relevant configuration file to set the PATH persistently. If you are going to add it to the system-wide PATH so that anyone can get it, edit /etc/profile. If you just want it for yourself, edit your .bashrc (~/.bashrc). As you may have guessed, this is for setups using GNU Bash. If you use a different shell, the lines in step 9 might need to be changed. Luckily, almost all mainstream Linux distributions use Bash as the default shell.
Eg: $ sudo vi /etc/profile

9. Add the following lines after everything (replace /opt/jre1.6.0_02/bin with your JRE bin path)

PATH=/opt/jre1.6.0_02/bin:$PATH
export PATH

Save (write) the file and exit. Now, the next time you log in, you'll have a Sun JRE ready for you. What step 9 does is prepend the path of the Java bin directory to the system PATH variable.
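
To check that the right Java is being picked up after you log in again, you can run something like the following (the exact version string you see will of course depend on the package you installed):

$ which java
$ java -version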

That's all. See, not that difficult. But of course, if you are using modern distributions like Ubuntu, Fedora, Debian, etc., you don't have to go through all this. Sun Java packages are usually available through the repositories. Just use your favorite package management software (Eg: Synaptic, YumEx, apt-get, yum, etc.) to install it.

Thursday, February 28, 2008

Off to the Hospital

Leave a Comment
Well, not right away. My health hasn't been good at all since the 17th. I caught what is suspected to be a viral fever on the night of the 16th, I guess. Anyway, I couldn't get up on Sunday the 17th.

I took medication, took medical leave on Monday, and got back to work on Wednesday. I wasn't feeling my best, but I could go on. I thought my weariness was just the after-effects of the fever. At the weekend I made it to my hometown, and casually measured my temperature at my parents' demand.

Bam! I was over 99.2 F (for those who don't know, 98.6 F is normal for an adult). I went to see a doctor and took medication. I thought it was over.

After that day, I noticed that I had a fever in the afternoon, daily. So I met a specialist two days ago, who had me do some tests, which turned out OK. But she warned me that if I still had a temperature in two days, I would have to be admitted.

Well, two days have passed, and I still have a fever. So I'm getting ready to go in early tomorrow morning. :)

I've not been idle this last week or so. I've worked remotely for the office and signed up for cool new things like GitHub and Heroku, but more details will have to wait. I think I have to wait a little longer before I work on the blog posts I promised; enjoy the post about OpenMoko meanwhile. I guess I have to conclude this post like this, and I'm afraid it looks like I'll have to stay offline for a couple of days.

Meanwhile everyone, take care. I'll be totally fine, I'm with my parents & brother. Nadee is also coming with her mom. So don't worry about me, just look after the Net until I return.

Monday, February 25, 2008

Open Source Revolution: Episode N - Attack of the Phones

5 comments
Once upon a time in a galaxy far far away called the Milky Way, there were phones! Some of the creatures there were smitten by the iPhone, the Nokia N Series or so-called smartphones jam-packed with features. And yet for some of us the whole requirement would be a Nokia 1100 or a Motorola F3 (or whatever brick that can make phone calls). Whatever the case may be, you should be well aware that the mobile phone arena is expanding its horizons in a frenzy. And being a FOSS geek, I can't help feeling optimistic about the current state of the art.

So far the mobile phone has been a gadget we buy from a vendor, use as it is, are not allowed to meddle with, preloaded with proprietary software (which is usually ugly, literally and metaphorically) and so on. In regions like the US and the EU, most phones come bound to an operator (so-called locked phones). Fortunately, in Sri Lanka it has not been that ugly: we buy unlocked phones and then put in a SIM from any GSM operator available here. So the freedom of the phone may not seem so affected to some people. But we all know that even in the current state of things, we are bound to the gadgets (hardware) and the pre-loaded software.

On the hardware side, we have been using custom designs from OEMs, relabelled by phone vendors and sometimes even re-branded by operators. These hardware platforms are usually not available to the public and have never been based on commodity technology. Software is even worse. While some use custom-built software, the smartphone, PDA, etc. market has been dominated by a few closed-box, secret things like Symbian, Windows Mobile and Palm. Maybe that is a reason why I've always been disappointed by the feature/price ratio of mobile phones. You may want to disagree, but I feel, as usual, that a limited number of players in the game can kill innovation and improvement. In how many areas have we seen this happen? Isn't this one of the most compelling reasons why people accepted the Open Source movement with open arms, that they were tired of the cartel?

However, in recent years we've seen a developing trend of Open Source software based, namely Linux based, phones. Some Asian vendors and even some bigger names like Motorola started using Linux on mobile phones. As time moved on we could see that it was beneficial in many aspects, ...just like in the software industry. This is the backdrop against which OEMs started making 3G and even 3.5G Linux phones, and big firms formed consortia to get involved with the FOSS revolution in mobile phones. This is also the backdrop against which Google, without a doubt a key innovator on the Net, made a somewhat unexpected move. People expect energetic moves from Google, and they have indeed skyrocketed technologies before. For the people who've been staying underground for at least the past 5 years, just ask someone about web search, Gmail, Ajax or blogs. Sure, they didn't create any of those, but they did elevate them to heights which are now the measuring points.

After the success of Apple's iPhone there was a rumor floating around about a gPhone from Google. While people were anticipating a phone, Google presented a platform: Android. Android is a mobile phone platform initiated by Google and supported by the Open Handset Alliance. The OHA consists of major names in the industry including Intel, NTT, HTC, Motorola, LG, Samsung, eBay, Marvell, Synaptics, Wind River, etc. Android is yet to come out with something solid, but an SDK is out and people are already hacking. Phones based on Android are expected within 2008 from major vendors. However, at this stage applications are limited to a Java platform (not standard Java ME or SE) and access to low-level device APIs is not available. There are a few other catches too. But I, for one, welcome our new mobile platform :) and guess we can expect interesting things from Android. However, my main expectation and the main focus/inspiration of this post is not Android, but OpenMoko.

Although OpenMoko may sound like a funny name, it's catchy. Admit it. And the slogan is also a Matrix-like "Free Your Phone". Rather than just playing with words, let me show you a couple of pictures. Ladies and gentlemen, I give you the Neo 1973 (and the Neo FreeRunner, which is not in the pictures).

Well, for some of you, especially for FOSS geeks, this is no news at all. We've been expecting this great gadget for quite some time.

Some might even go far enough to say that it looks awfully similar to an iPhone. That is where I step in again to enlighten you: it is awfully the other way around.

This cool thing you see in the pictures is called the Neo 1973, named after the year the mobile phone was invented. The Neo 1973, which was a developer-oriented device, is sold out in order to pave the way for its spiritual successor, the Neo FreeRunner, which, I'm told by their sales team, will be available in mid April.

The Neo devices are the first available mobile phones using the OpenMoko platform. The OpenMoko platform is quite similar to Android in its aims, but is older and much, much more open than Android. OpenMoko based phones, namely the Neo devices, are based on more commodity hardware. Even the CAD drawings for the Neo 1973 are open, and CAD drawings for the FreeRunner are also expected.


Here's the hardware spec for the current (FreeRunner) device:
(Only 2.5G is supported for now, will be upgraded to 3G in future)
  • 120.7 x 62 x 18.5mm factor
  • 184g weight (unconfirmed yet)
  • 2.8" 640 x 480 VGA Color TFT LCD
  • Samsung 2442 SoC (400 MHz)
  • SMedia 3362 3D Graphics Accelerator
  • 128MB RAM
  • 256MB Flash
  • 2x 3D Accelerometers (wow, cool)
  • Tri-Band GPRS/GSM - 900 (850 for N. America)/1800/1900 MHz
  • WiFi 802.11b/g
  • Bluetooth 2.0 EDR
  • AGPS (Assisted GPS) receiver
  • USB 1.1
  • Micro SD slot
  • 2.5mm audio jack
  • Replaceable 1200mAh battery
  • Touchscreen (finger/stylus)
The FreeRunner (which will look strikingly similar to the Neo 1973) is operated with a full touch interface, both finger and stylus, and supports motion sensing and gesture-based operations (sweet, that's what the accelerometers are for). Neo devices also ship with quite a few goodies like a headset, pouch, lanyard, micro SD card, USB connector cables and an uber-cool stylus which is also a pen/laser pointer/flashlight (with extra batteries). Now isn't that appealing or what?

Neo FreeRunner prices are yet to be unveiled; the Neo 1973 was sold at $399. The software platform is under development. In April, hopefully, the FreeRunner will make it into the hands of mobile developers, FOSS lovers, enthusiasts, etc. Then, given the development and testing exposure, it will be ready for prime time. But I guess I'm looking forward to the April release rather than waiting for the public release. :)

In keeping with the FOSS model, the complete software stack, the OpenMoko software stack, is 100% Open Source. Yes, not only the open API fantasy or the FOSS userland, but the whole thing, including the OS (kernel, etc.), the Java VM, etc., is Open Source. You are not hindered by a vendor's SDK or API. You are given complete freedom to do whatever you like with it. Now that's what I call freedom on the phone.

So friends and comrades, let's flock on and walk together to Free the Phone.

Image courtesy: www.openmoko.com, www.linuxdevices.com

Wednesday, February 13, 2008

I'll be Back on This Blog, Very Soon

1 comment
I know it's been a while since my last post, but I'm still around despite the rumour that I might have been in the Fort Railway Station bomb blast. That's a joke, a very bad and tasteless one, I know. I shouldn't joke about security (literally). So let me apologize to everyone and convey my condolences to the loved ones of all who lost their lives and were affected by the recent terrorist attacks in Sri Lanka.

Then let me give you this brief note. I've been too busy the past couple of weeks (actually months) with a lot of things, including my new employment. So I wish to let you know that I'll be back very soon; I already have a couple of things as drafts. So stay tuned. The good news is that I have loads and loads to say about things (tech/non-tech), especially web servers (Apache, Nginx, Mongrel, Thin, etc.), Ruby, Rails, web hosting (slices/VPS), Rails deployment (Capistrano, etc.), GNU/Linux, Open Source (as usual), Open Source mobile phones (OpenMoko, Android), rants about chain letters, etc., etc. ...and why, a few more things about Git (by which name some people call me these days, with my blessing too :)

I even have info about a couple of interesting people I talked with lately. One is an Italian Ruby geek and one is a most talented and popular musician in SL. Some long-term readers might notice that I'm starting a new blog too (shout out: thanks Bo for the inspiration and kind words; my poetry and Sinhala blogging is coming at last), which is going to be bilingual and will be about non-geeky things. And I've got on Twitter too. :)

So thanks, everyone, for the support so far. It was great to learn how many people were actually following my blog, all because I couldn't blog for a while. :) You can expect a lot of updates here after this weekend, once I've finished some major hauls.

And oh..., Happy Valentine's in advance... that's if you are celebrating, of course (unlike me and Nadee). Sorry that I can't try to be funny as usual, since I'm busy right now. Even this was squeezed out because a meeting was cancelled. :)