Tuesday, September 2, 2008

Chrome

I posted some cynicism, speculation, and first gut reactions when I heard about Chrome earlier today.

Well, I fired up one of my old XP PCs that doesn't get much action these days and downloaded Chrome. This post is being made from Chrome.

My first 10-minute reactions...

  • Quick download and install.
  • Nice minimalistic feel.
  • Fast rendering, even on a clunky old PC that normally feels pretty slow. Not a very technical benchmark, but call a spade a spade: it is much faster than IE7 on the same machine.
  • Good, modern HTML rendering and CSS support (à la WebKit).
  • Opening a new tab shows your most visited sites, with a similar feel to Opera's Speed Dial.

More to come...once the Linux version is available.

Browser Wars 2?

Is it a browser? Is it an OS? It's, it's, it's Super Google Browser.

http://googlesystem.blogspot.com/2008/09/google-os-is-actually-browser-google.html

Eventually I'll take more time to think about what this means for people like myself who create web-based applications. For now, the first thing that comes to mind is: please, no Browser Wars 2.

For the sake of all of us who waste so much time, effort, and productivity trying to make standards-compliant HTML and CSS look good in each browser, the initial "oh shit, another one to worry about" outweighs my initial "oh wow!".

Some parting concerns... Google is about advertising. Google is about collecting data. A Google OS/browser thing-a-ma-bob sure has the potential to know an awful lot about me. This is sure to add fuel to the 'Google is evil', 'Google is big brother' conspiracy fire.

And finally, a tongue-in-cheek thought... if Fedzilla, in the United States versus Microsoft case, accused Microsoft of abusing monopoly power by bundling their "OS" with their "Browser", and the big "G" now bundles their "OS" with a "Browser" (Google OS Is Actually a Browser), and they already dominate the search market to boot... hmm.

Thursday, August 28, 2008

You're blind ump

Two of my favorite things: systems and baseball. Since I was a kid, I've always been fascinated to learn how things work; my mom didn't own too many kitchen gadgets that hadn't been taken apart by yours truly a time or two to understand the inner workings. Old rotary phones, TVs, radios, record players, nothing was off limits for my research. Ah, the excitement and joy found during one particular experiment that involved getting the operator on the other end of a radio speaker that was jerry-rigged into a household phone line... good memories. As a foreshadow to the rest of this post, my first year at an engineering college even included a modeling and design class with a 'high tech' project modeling the feedback controls for a fictitious futuristic system to tell if a tennis ball was in bounds or not :-) Fun stuff.

Also, since I was a kid, I've loved the game of baseball. To this day, my favorite thing to listen to on the radio (yes, some of us still enjoy listening to games on the radio) is a baseball game. The majority of the time I turn on the 'tube' is to catch part of a game. There is just something about baseball that I find so much more enjoyable than other sports. Some of my favorite memories of 'play time' with my kids involved seeing how many neighborhood kids we could fit in the minivan and going to the local field for a few hours. I'm not talking about parent-hyped and sanctioned little league here, just pure, unadulterated pickup baseball without enough people to even cover all the positions. The pickup games don't happen so much these days, but the radio and TV still bring the pros playing my favorite pastime almost every night from spring to fall.

Well today, two of my favorite things combine. You would think I'd be excited. But, I'm not. 

At least for me, these two have always been separate things. When I've had enough technology for the day, baseball is a great pastime. But the idea of mixing the two just feels wrong. To be fair, it's already been happening for a while: pitching speed guns, big high-tech scoreboards, and 'jumbotron' screens. But nothing to date that actually affects the real-time outcome of the game at that moment. Especially since the big screen to date usually just shows some silly kissing cam instead of replaying controversial calls.

Today, baseball starts using instant replay to tell if a home run is really a home run. Personally, I like the chubby middle-aged guy standing in the middle of the field or behind the plate waving his finger in the air.

Yes, even being a Mets fan and knowing what happened in Yankee Stadium this year, when even umpire Davidson said after the game, "I f*cked up." Yes, even after one of our own called for instant replay help early in the season in Miami. I still think it's wrong.

Bad calls are part of baseball. I realize we are not talking strikes and balls here, but what's next? When I go to a ball game, I expect bad calls. "You're blind, ump, you're blind, ump. You must be out of your mind, ump"; it's part of the fun. We fans have done it for years. It's like hot dogs, peanuts, cotton candy, and bringing your glove to the game for a foul ball, even if your seats are upper deck by the left field foul pole.

I know the managers voted 25 to 5. I would guess most professional players would vote in a similar proportion. Other fans I've talked to seem to split about the same way. But I'm one fan saddened by today's changes to my favorite pastime.

Tuesday, June 17, 2008

Opera 9.5

Opera 9.5 is finally available! I've been running the beta version for a while and have been pretty happy. The final release has a little more polish and has so far been perfect. It feels like I've been running it for days already, but I think the official download was only a day or two ago... what can I say, my days are very long right now as we get iZoca ready for our beta clients.

I've been an Opera fan for a couple of years. I think I started becoming more interested somewhere around the same time they decided $0 was a better price point than something crazy like $30. Opera is my browser of choice: consistent and good CSS support, performance, great tab support, Speed Dial, etc.

9.5 has all that we've come to know and love about Opera, but seems to do it all a little faster. There is additional support for some CSS level 2 specifications, and it now also includes some CSS 3 features like text-shadow. How cool is that? Oh, well, if you didn't see anything cool about that "text-shadow" thing, then you'll need to download Opera 9.5 and take a look again :-) You can get a complete look at the CSS 3 support over at the Opera 9.5 page.

One of the things that I still regularly go back to Firefox for is the great set of tools available for web developers. Opera is getting better at this, though. I've been using their new set of developer tools in beta for a few months. It's code-named Dragonfly, and it's part of 9.5. There is still some catching up to do, but I like what I'm seeing, and I find myself opening Firefox when developing a little less than I used to.

I'm also a fan of the Opera email client, M2. It got a bit of an upgrade as part of 9.5, too, including an upgrade to the mail database as part of this install. The mail search features are amazingly fast and powerful. M2 uses a single indexed database, with filtered views for browsing through mail. This is a little different from the traditional folder-based approach that a lot of other mail clients use. At first you may ask yourself how a mail can be available in 3 different folders all at the same time. The short answer is that the folders are just "views" of the mail data. It's pretty powerful once you experience it and get accustomed to it.
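The views-over-one-store idea is easy to sketch. This is just a toy illustration of the concept (made-up mails and view names), not how M2 is actually implemented:

```ruby
# One indexed store of mails; "folders" are just saved filters (views).
Mail = Struct.new(:from, :subject, :read)

mails = [
  Mail.new("alice@example.com", "Re: project", false),
  Mail.new("bob@example.com",   "lunch?",      true),
]

views = {
  "Unread"     => ->(m) { !m.read },
  "From Alice" => ->(m) { m.from.start_with?("alice") },
}

# The same mail can show up in several "folders" because each view is
# just a filter over the single underlying store.
views.each do |name, filter|
  puts "#{name}: #{mails.select(&filter).map(&:subject).join(', ')}"
end
```

The mail from alice appears under both "Unread" and "From Alice" without being stored twice, which is the trick behind M2's fast search.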

So far, not a single issue to report. The only real problem that I never found a workaround for while in beta was some flaky behavior when connecting to some public wi-fi hot spots that force a bunch of HTTP redirection just to get started. That seems to be 100% corrected as of the official release.

Thursday, June 12, 2008

git, svn and our current dev workflow

At iZoca, we are currently working in a distributed development environment, using a hybrid workflow that utilizes both subversion and git. There are already quite a few blog posts out there about how people are integrating git into their existing subversion-based development workflow, and most of our approach has been learned from reading these.

However, we've had some small hurdles to deal with when it comes to freezing rails versions and plugins, and applying patches (either our own, or from the community) back into these plugins before they make it back into their respective master branches. We've also had some small hurdles just figuring out the workflow that feels the best for keeping rails versions and plugins up to date. So, I thought I'd share the approach that seems to be working best for us right now.

First, to set the background. Like most people with a hybrid subversion/git workflow, we primarily use our subversion repository as our "staging" repository, if you will, for our deployment, and as a push/pull conduit for getting and keeping everyone's local git masters fresh.

We are following the common recipe of using git-svn as our bridge. This recipe is pretty well documented (a quick 'git svn' Google search will turn up some examples). The recipe goes something like this:
cd my_rails_app

git checkout master                 <<== get yourself to master
git svn rebase                      <<== make sure master is up to date
git checkout -b my_new_feature_branch

test, code, fix, test, code, fix...

git commit -a -m "my new feature stuff"   <<== maybe preceded by some individual git add x, git rm x

git checkout master                 <<== I think you got it now, we are switching back to the master branch
git svn rebase                      <<== get any newly committed changes to the shared master trunk
git checkout my_new_feature_branch  <<== switch back to the feature branch
git rebase master                   <<== attempt to apply our changes on top of the latest trunk of code
(fix conflicts, if any)

git checkout master
git merge my_new_feature_branch     <<== bring our new feature into master (maybe --squash if there were a number of individual commits to get this branch done and we want them to show as a single commit)
git svn rebase                      <<== make sure things are good
git svn dcommit                     <<== push our modified master up to subversion trunk

This is the basic day-in, day-out workflow. One of the assumptions is that "git master" and "svn/trunk" are analogous to one another. If you are working on code that is from an origin other than trunk (like a subversion branch/VERSION_x_y_z), you will be working in a local git branch that you create by: git branch local-branch_x_y_z VERSION_x_y_z


The workflow above is basically the same, with local-branch_x_y_z playing the part of master. Once your bug_fix_branch or small_feature_branch is merged back in, you probably want to keep things tidy by cleaning up when you're done:

git branch -d my_new_feature_branch
So, this is all pretty well documented and seems to be the approach working best for most people using a subversion/git hybrid. As I mentioned earlier, the hurdles we've had are usually related to plugins and vendor/rails versioning and sourcing.

One of the problems is that the common approach for those using a git-only workflow is to use submodules for managing externally versioned and maintained sources. So, let's say you want to include the version of rails you are using within your project instead of relying on installed gems; freezing rails, as we all know it. Well, one approach is to "git clone" the rails branch/master (edge) you want directly into your vendor directory. The problem here is that now, within your project, you also have another complete git repository; remember, git clone gets the whole repository and history. You could always choose to clone with a depth of 1, but you've still got a repository within yours. Additionally, if you try to treat it as a git submodule, then "git svn dcommit" is going to have a problem the next time you rebase/dcommit. There are some published workarounds for the git-svn dcommit/rebase bug.
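For reference, the shallow-clone variant mentioned above looks something like this (vendor/rails being the conventional destination for a frozen rails; the point stands that you still end up with a nested .git directory):

```shell
# Shallow clone: grab only the latest revision instead of the full history.
# This shrinks the download, but a .git directory still lands inside
# your project tree, which is exactly the problem discussed above.
git clone --depth 1 git://github.com/rails/rails.git vendor/rails
```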

But when I stepped back and looked at things with iZoca, I questioned whether submodules were the right answer even if they worked with git-svn. The problem is that we want to be on the edge. We want to be active in the community. We want to contribute to open source, collaborate, and naturally harvest the benefits of what others are doing with open source. And no matter how we slice it, when we step back and look at it, submodules don't seem to be the answer for us (even if they worked with git-svn). It seems like that approach works well if you want to freeze to a particular version and, at some time in the future, pull the latest features or check out the latest branch. But for actively working in any of these projects, the work feels more like monkey patching than it should... at least it did to us.

We are taking a bit more of a distributed development approach with these external projects. Rails, rails plugins, javascript frameworks, etc. can simply be archived back into their respective locations in our core application by whichever developer happens to be working with that source. Developers maintain separate project folders, outside of our core application, that are clones of the source they work with. They can pull, branch, diff, and patch at will in these projects, and collaborate with the community at large, without interfering with iZoca proper. When changes from this work are ready to come into iZoca proper, the respective developer can simply:

git archive the respective project branch back into iZoca proper, run the tests, and call it a day.

From an iZoca perspective, this "copied in" archive just becomes part of a local iZoca git branch changeset, which is then merged back into master when ready. This lets the iZoca core branch stay a little less complicated.

The alternative, submodules with multiple branches at various versions all living within core branches, seems a bit more complicated than we want it to be, even if it worked with svn. Maybe I just don't comprehend submodules properly to begin with, but the approach we are taking now seems to be working well.

Not every developer will necessarily have a clone of each of the plugins, or even of rails, all their own. It all depends on what they will be working on, contributing to, etc. We don't try to keep a centralized version of each of these separate repositories, because that kind of goes against the grain of distributed version control anyway. Depending on the scope of each of these projects, some may even be forks of github repos, but that isn't a requirement.

Rails prefers diff patches via Lighthouse, so a github fork of Rails doesn't really seem necessary for this kind of work. But one of our developers might find that helpful at some point, and it doesn't really matter. The point is that at some point we may need, want, or desire to get a developer's branch of plugin xyz into our core iZoca master to fix or enhance something. And when we do, the developer who needs the fix (or wants to pull a recent change set) can either collaborate with another developer who is familiar with that plugin or section of rails and get a diff patch from that person, or work on that source themselves in a local branch of the respective project. When done, they simply archive the result of that patched plugin into a local branch of our core iZoca application, test, merge, and eventually git-svn dcommit.

An example of the archive/copy... let's assume that I want to fix a bug in some_cool_plugin. I would either have or create a local git clone of some_cool_plugin (the rails repository URL below is just standing in as the example):

git clone git://github.com/rails/rails.git

Or, if I already have the clone:
git checkout master

git pull
git branch my_fix_to_some_cool_plugin

test, code, fix, test, code, fix

git commit -a -m "patching bug with my cool hack"

git checkout master

git pull

git checkout my_fix_to_some_cool_plugin

git rebase master
git format-patch master --stdout > my-cool-patch-file.diff

#email the patch, submit it via Lighthouse, or git push to a forked github repo.
#Then, for me to archive my latest patched version of the plugin back into our core application,
#before the patch has been officially approved or committed back to the plugin/rails master,
#we can do something like the following:
git archive --prefix=some_cool_plugin/ HEAD | (cd ~/scott/Projects/izoca/vendor/plugins && tar -x)

This gets the latest patched code, without the .git repo, back into our core application branch, to then be committed, merged, and dcommitted just as normal.
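If you want to convince yourself that git archive really leaves the .git directory behind, here is a tiny self-contained demo (throwaway repo and made-up file names, nothing to do with iZoca's actual tree):

```shell
set -e

# Build a throwaway "plugin" repository in a temp directory.
src=$(mktemp -d)
dst=$(mktemp -d)

cd "$src"
git init -q
git config user.email demo@example.com
git config user.name demo
echo "puts 'hello'" > init.rb
git add init.rb
git commit -q -m "initial"

# Copy the tree (but not the repository) into the destination,
# prefixed the way a vendored plugin would be.
git archive --prefix=some_cool_plugin/ HEAD | (cd "$dst" && tar -x)

ls "$dst/some_cool_plugin"
```

The destination gets init.rb under some_cool_plugin/, but no nested .git, so the copied files become plain changes in the receiving repository.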

Hope this approach helps somebody else; it seems to be working pretty well for us right now.

Friday, June 6, 2008

Opera spellcheck in OpenSolaris

Opera will use your GNU Aspell install, if you have one, to implement spell checking. I seem to be having some problems with my aspell package install, however.
So, "pfexec pkg install SUNWaspell SUNWaspell-en" seems to finish, but I don't see any aspell libraries installed anywhere. Hmmm, broken package maybe? I guess I don't know enough to claim a broken package; I just don't see any results from installing it.

So, switch gears to Blastwave:
pfexec pkg-get install aspell aspellen

Now I have my aspell libraries under /opt/csw/lib/libaspell*

So, let's link them under /usr/lib so that Opera and others can find them:
pfexec ln -s /opt/csw/lib/libaspell* /usr/lib/

Open Opera...ahh, we have spell check.

ls colors OpenSolaris terminal

The default gnome-terminal experience in the OpenSolaris install is lacking colors, at least for my personal preferences. But I just alias my default ls to the GNU version in my .bashrc and away we go...

Add this to your .bashrc:
alias ls='/usr/gnu/bin/ls --color=auto'

Thursday, June 5, 2008

An afternoon with OpenSolaris

Last year, right about this time, I blogged that I was keeping an eye on OpenSolaris. I was interested in investigating it a bit more, but didn't feel that the timing was right for me to jump into the mix yet. Mainly because, for a complete outsider like me, the path of entry seemed confusing. I've been holding my hand up since reading this question on Ian Murdock's blog a year ago:
"How many of you would take Solaris for a spin if doing so was as easy as, say, downloading the latest version of Ubuntu and installing it?"

Well, guess what? The time has come. This past weekend I downloaded the live CD release of OpenSolaris 2008.05, stuck the CD in my Dell XPS M1530, rebooted, and I was feeling a whole lot of OpenSolaris love. The experience was wonderful! It was the same 'wow, even a newbie like me can do this' feeling that I had the first time I gave an Ubuntu live CD a spin a few years back. That little Ubuntu spin was so wonderful that, before I knew it, I went from a Linux newbie to a Linux-only user. This latest spin of OpenSolaris has been every bit as rewarding, and I have a feeling I'll be spinning OpenSolaris a bit more on my hard drive as time goes on.

Yesterday, I decided to format some space on my development laptop's hard drive for an OpenSolaris partition. This machine is being used almost around the clock these days for development using Ruby, Ruby on Rails, and MySQL, with a Rails-pimped-up version of gvim. I use Opera (9.5 beta, to be exact) for most personal web browsing and mail, and Firefox for development plugins. So, my first question was "how long will it take me to get there?" I don't have time to waste right now, but I really wanted to give this a spin. So, I backed up my Ubuntu Linux partition the night before and I was ready to go.

What follows are my notes that I made along the way while going through my first afternoon on OpenSolaris.

Most of it is not well written, and I make no claims to its accuracy. But I tried to capture each step as it took place. Naturally, it's a little more of a trial-and-error process while it's really happening than these final notes end up conveying. So, without further delay, here are my notes from my first day with OpenSolaris... enjoy. I sure did!

Install was flawless. During the live CD run, I was prompted with the available wifi connections; I picked mine, entered my key, and then the network automagic finished its magic. All flawless. Somehow these settings were carried from my live CD session into my install, because right after booting up for the first time after the install, I was told that I had successfully acquired my IP address, and saw that my wifi connection was indeed up and running. Didn't touch a thing... sweet! Almost too sweet, though. I'll need to dig into this nwam thing a little and understand where it's keeping my WEP keys, order preference, etc. But hey, I'd rather be researching why it's working instead of why it's not working :-)

Also, right out of the box I had great-looking fonts and NVIDIA graphics support... hey, this is looking pretty promising. I mean, at this point we are just talking about the housekeeping tasks of getting up and running; we aren't even talking about any of the hardcore things that make OpenSolaris so attractive (real SysV-based Unix, zones, zfs, dtrace, etc.). But even so, just talking out-of-the-box housekeeping, this is already looking pretty sweet and very promising.

Open nautilus, go to Network... bam, the other Windows computers in my house are all already sitting there via samba shares. Absolutely nothing for me to do - sweet! I mentioned that I backed up everything I wanted protected from my Ubuntu install onto a shared network drive in the house. That was my FreeAgent Pro, which is attached to a Windows PC in the house. And now, right out of the box, I can see my backup stuff over wifi by accessing a Windows shared drive to my FreeAgent Pro. Ah, did I mention that all I've done so far is boot up? Very cool.

Next, download Opera 9.5 beta (I've been running this under my Ubuntu install since the first beta came out months ago), then "pkgadd" the downloaded opera package, copy over my .opera directory from the network drive into my new home directory, open Opera... and bam, all my mail and mail settings, bookmarks, and Opera configurations. Very nice! Continue to poke around a little more in Opera and everything seems to be good. I'll need to take care of my plugins, and probably symlink my opera plugin directory to my Firefox directory to share some, but I'll let the plugin thing wait for later. More meat-and-potatoes stuff to tackle first.

Type ruby -v, got no ruby love. OK, "pkg search -r ruby", find the "SUNWruby1.86" package name. OK, "pkg install SUNWruby1.86"... ah, my ruby love is there. Let's add ruby to my path... open a new .bashrc (I don't see one here yet), add PATH=$PATH:/usr/ruby/1.8/bin ; export PATH. OK, a quick IRB session to poke around a little. Then off to take care of getting some Ruby Gems going.

OK, "gem environment" to see what we are working with. Looks like the Sun ruby package is using "/var/ruby/1.8/gem_home". OK, a quick peek just to see if there is any out-of-the-box gem stuff there that might be part of the special Sun ruby package. I don't think so; gem_home looks empty and I think I'm starting from scratch. Start going to town on a few of the more common gems that I need.

Oops. Small problem installing any gems that require a C compile. Looks like path-related stuff. Seems like it shouldn't be too bad to address, but you never know. Quick Google search... ah, it's just an issue with the C compiler path location, as explained here: http://blogs.sun.com/prashant/entry/dtrace_support1. Once I updated my rbconfig.rb file as suggested in that blog post, all my gem installs went fine from there on out.

Next, let's check and see if we have any MySQL support out of the box... nope. OK, let's see what is out there: "pkg search -r mysql". Looks like some mysql 5 and mysql 4 packages and support packages. Let's take anything that might be needed to get mysql5 up and running (remember, this is about getting going fast, not worrying at this point that maybe I'm loading an extra library or two that I don't need): pkg install SUNWmysql5 SUNWmysql-base SUNWsfwhea

OK, packages are done. Let's try just executing mysql and see if anyone is home. Oops... error, can't connect to local MySQL... OK, so we have mysql installed, but obviously it's not running. Let's see what we have: "svcs mysql" gives me "disabled svc:/application/database/mysql:version_50". Looks like it just needs to be started: "svcadm enable svc:/application/database/mysql:version_50"

OK, now that we are enabled, let's change the root password: mysqladmin... oops, no path. Well, I plan on using the mysql commands a lot, so let's add these to my path too. Just as above, add another path entry for ":/usr/mysql/5.0/bin". Now, "mysqladmin -u root password 'new pass'". OK, mysql is up and the default root pass is changed.

Hmm... some intermittent network drops along the way. I didn't stop to note it the first few times, but it just happened again. Not sure if that's OpenSolaris, NWAM, my hardware, or some household interference causing problems. I found the following commands a bit helpful troubleshooting this:

dladm show-wifi -p
  ==> print out current wifi data link info
dladm scan-wifi
  ==> show all available wifi links
svcadm restart nwam
  ==> restart network auto magic

I'm not sure what my wifi issues are yet, but they haven't been show stoppers. Generally, a "svcadm restart nwam" did the trick. Once, I even went as far as disabling nwam and configuring things manually through network-adm, but I went back to the nwam auto magic after the next reboot, and it's been fine since.

I checked my Device Driver utility, and it looks like it's using an Intel Pro/Wireless 4965 AG or AGN. I'll have to look and see if that is right. But things are generally pretty good again now. I'll need to keep my eye on this and get back to it as time permits (or as the problem worsens - ha, whichever comes first).

The new OpenSolaris Image Packaging System (pkg install, etc.) is the way to go, but it's young and growing, and not everything you might need is in there yet. So, falling back to Blastwave seems like the way to go when necessary:
cd /tmp
wget http://www.blastwave.org/pkg_get.pkg
pkgadd -d pkg_get.pkg
export PATH=$PATH:/opt/csw/bin
pkg-get install <package>

I use the full GUI (glib'd) version of vim for most of my development work, and the out-of-the-box vim wasn't compiled with the GUI (-g) option. Unfortunately, it doesn't look like there is a package in IPS yet for gvim, so I turned to Blastwave: pkg-get install gvim (then sit back and watch the billion-and-one glib, gtk, and glade dependencies get loaded). Oh well; for me, at the moment, a small price to pay to get my favorite text editor up and running.

OK, we have gvim (I just added the Blastwave install location, /opt/csw/bin, to my path to make the gvim command simple). Copy over my old .vimrc and .vim directory... fire up gvim. Not too bad. Some problems with path issues for things in my .vimrc, which I'll either pull out for now or comment until I can address them. But overall, not bad. I have my Ruby and Ruby on Rails plugins, as well as most everything else.

Peek to see if we have OpenOffice... looks like that is not in the out-of-the-box default. OK, pkg install OpenOffice, wait for the download and install, and we are good to go with OpenOffice 2.4.

So, that was yesterday. Since then, I've been working full steam ahead on my current Ruby on Rails project. After taking these notes yesterday, I connected to my svn repository over ssh with no problem, checked out my project, and have been going to town. This is pretty amazing. I'm at a day and a half, and haven't had to boot back into Ubuntu a single time to get anything or do anything that I forgot along the way.

Well, Ian, last year when you asked, "How many of you would take Solaris for a spin if doing so was as easy as, say, downloading the latest version of Ubuntu and installing it?", my hand immediately went up. Well, it just came down, because I just did exactly that. Thanks to everyone involved. Hopefully my OpenSolaris fu will get better and I can give back to the community as I learn.

Thursday, May 1, 2008

hardy upgrade notes

I upgraded to Hardy a few days ago. So far, it's been a pretty smooth transition.
A couple of points...

VIM crashed whenever I used the tab text-expansion helper integrated with rails.vim (have I ever mentioned how much I love the rails.vim integration?). Anyway, I got the latest vim source, recompiled, and all is well on that front.

Firefox 3.0b5 - this is probably the biggest pita of the experience. I use Firefox more as a developer tool these days than a regular browser. So, it's all pimped up with my favorite add-ons, extensions, etc., a few of which were not happy with Firefox 3.0b5. Here are a couple that come to mind as problems:

  • Firebug - fortunately the great folks here have an alpha 1.2 version that seems to be doing the trick for me

  • Html Validator - I found a posting somewhere (that I can't seem to find again now) suggesting that libxul-dev would help. It did.

Monday, April 28, 2008

congrats to TCBTB

Congratulations to The Cast Before The Break on getting a spot on the Bamboozle line-up this coming weekend. I know you guys have worked your asses off to get to that festival - have a blast with it!!!

Great show this past weekend in Oneonta; it was excellent to see you guys again! I can't believe you opened with Understanding the Universe - it was the first time I saw it live, and it sounded great.

Kick ass at Bamboozle!

Sunday, April 27, 2008

goruco2008

I attended GORUCO yesterday.
What a great time; great presentations, great after party, great people.
I didn't bring a camera, but others are tagging photos from the event.

Some of my highlights of the event:

Bryan Helmkamp's Story Driven Development. I've looked at RSpec a number of times, but have never gotten my butt off of standard rails unit tests. The new story framework and Webrat integration with RSpec has inspired me to schedule some time to look at this again. I really liked the feel of the "given some situation", "when this happens", "then I expect these results" syntax. And the Webrat DOM integration to fill in form fields, with syntax as simple as fills_in "last_name", :with => "shaw", looks sweet.
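From memory, an RSpec plain-text story looked something like the sketch below (the story, scenario, and field names here are all made up; treat the exact wording as an approximation of the 1.x story-runner format rather than a verbatim example):

```
Story: signing up

  Scenario: new visitor creates an account
    Given an anonymous visitor on the signup page
    When the visitor fills in last_name with "shaw"
    And the visitor clicks "Sign up"
    Then an account should be created
```

Each Given/When/Then line maps to a Ruby step definition, which is where the Webrat calls like fills_in live.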

Giles Bowkett's presentation. OMG! As soon as this is up on Confreaks, check it out. I really can't begin to explain this presentation; you have to see it to understand. Make sure the kids aren't around, or at least the headphones are on. He was on a rant about VC firms being muppet f'ckrs that was pretty hilarious.

Chris Wanstrath's Forbidden Fruit - Ruby's parse tree. I'm a fan of his Ambition project and enjoyed hearing him speak.

Ezra's Merb: 'All you need, none you don't'. I've been keeping an eye on Merb since Ezra first started publicly talking about it, so hearing him speak about it was cool. However, the really huge news was the information about what he is doing for Rails. Ezra is taking what he has learned from writing Merb and giving back to Rails. He has been hacking away at Rails ActionPack, removing the legacy CGI dependencies. He has reworked ActionPack to play nicely with Rack, and in doing so has supposedly made some improvements in the size of the rails mutex lock. This is really huge news. He mentioned that he is devoting something like his next 6 weeks to getting this stuff wrapped up and into rails core. I just realized that he blogged about this too. Wow!!!

The after party was a blast! Engine Yard hosted it at the Tribeca Grand: open bar, wi-fi, Wii gaming, and general meet-and-greet stuff. Before the night was over, I got a chance to say hey to a few of the guys I was looking forward to meeting.

Great event!

Wednesday, April 23, 2008

rails input text size - redo

A few days ago, I posted about using the DEFAULT_FIELD_OPTIONS constant in Rails to override the default options, which include a field size of 30. In that post, I put the constant definition in my application_settings.rb initializer. While I do like to put all my application constant definitions there, in this case it was a slight problem. Things worked, but there were warnings at app start-up, which I missed, about the constant already being defined. Duh... my application_settings.rb isn't loaded until after Rails is loaded, which is a bit late.

In this case it probably makes more sense to revert to the old environment.rb that we now try to keep our paws out of. So, I added the following to my environment.rb, and now I get the desired result without generating "constant already defined" warnings.

# in config/environment.rb
module ActionView
  module Helpers
    class InstanceTag
      DEFAULT_FIELD_OPTIONS = { }
    end
  end
end

Saturday, April 19, 2008

specifying rails input text size

I was bothered recently, while building some forms, by the default size attribute of 30 that the Rails text_field form helper adds to all text fields if one is not specified. More often than not, I don't want a size attribute at all unless I explicitly provide one; otherwise, my desired behaviour is the browser's default text box, which is then presented with style.

Rails makes this simple; just override the default values in the DEFAULT_FIELD_OPTIONS constant.
I did this by adding the following to my application_settings.rb (I've gotten in the habit of putting a lot of my constant definitions there instead of cluttering up environment files with constants):
ActionView::Helpers::InstanceTag::DEFAULT_FIELD_OPTIONS = { }

This works because the InstanceTag class in form_helper.rb checks to see if DEFAULT_FIELD_OPTIONS has already been defined before it defines it for you. Opening up form_helper.rb shows me that the only default in this constant is { "size" => 30 }, so I'm good just setting it to an empty hash in this case.
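The guard works something like this (paraphrased with a stand-in class of my own, not the exact Rails source):

```ruby
# Stand-in for ActionView::Helpers::InstanceTag to show the guard pattern.
class FakeInstanceTag
  DEFAULT_FIELD_OPTIONS = { }  # our override, loaded first

  # form_helper.rb-style guarded default: skipped because the constant
  # already exists, so no "already initialized constant" warning fires.
  DEFAULT_FIELD_OPTIONS = { "size" => 30 } unless const_defined?(:DEFAULT_FIELD_OPTIONS)
end
```

Whoever assigns the constant first wins, which is exactly why the override has to load before Rails does.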

Monday, April 14, 2008

stupid phone calls

This is why I don't pick up my phone most of the time when it rings during the day...
I answered a phone call today that, by the caller ID, was obviously a sales call, but the phone kept ringing and it was driving me crazy.

Me: hello

Caller: "...this is Dell Financial Services. We apologize for the inconvenience, but we called to tell you about new options available with Dell protection plans. Unfortunately, all of our agents are busy. We will have to call you back at another time."

Me: hang up phone and stew in frustration, never wanting to purchase anything from Dell again.

I realize this is not just a Dell thing. My guess is that it is probably another company altogether that Dell outsources these "upsell" phone calls to. But you know what, Dell? Little crap like this really pisses me off.


So, I get interrupted from what I was working on, get my butt up out of my chair, run to make it before the 3rd ring so it doesn't go to the answering machine, and worst of all, lose my concentration on what I was working on. I do all this so Dell's customer service calling queue can tell me that: a) they called me, but b) they don't really have the time to speak to me.

Are you kidding me? Why don't you just come out and say, "Mr. Customer, your time means nothing to us; our time means everything to us"?

Wednesday, April 2, 2008

iZoca

Towards the end of '07, I started flirting with a thing called iZoca.

Jeff first introduced me to his idea in November, but it took a month or so for me to admit that I was infatuated with the plan. My involvement became official in December, but just in a nights-and-weekends capacity while I paid the bills with my existing contract day-job work.

That all changes today! I'm now making a full-time run at this iZoca thing, and I'm convinced that we are building something great. Our team is small, but very motivated and all very excited about what we are doing. Our board is incredible, and I am humbled to be involved with such a great group of people.

There isn't much I can say about what we are building just yet, but stay tuned. Give us a visit at www.izoca.com to learn more, and if you feel compelled you can register for some beta info and we will inform you when it becomes available later this summer.

From a technology standpoint, here is a list of things important to us:
Ruby - oh, how I love Ruby
microformats
OpenID
iCal
semantically meaningful markup
valid xhtml/html
unobtrusive JavaScript

And some of the tools, utilities, frameworks, etc. getting us there:
Ruby on Rails
MySql
HAML
JQuery
git (this is new on our list, and I didn't believe the hype until I tried it myself; now it's an important part of the list)
Capistrano
VIM
Firefox Developer tools, Firefox Firebug
espresso

cool stuff. cool people. cool approach. cool ideas. long days. longer nights. start-up
lifestyle. Wow...these are definitely some exciting times!

Friday, March 14, 2008

music

The right mix of intelligent lyrics, great vocals, and harmonies all backed by a foundation of explosive rock is something that I'm always on the hunt for.

A couple of weeks ago I saw it live in the small town of Oneonta, NY, from a college band with a big name: The Cast Before The Break.

Google them.

Check them out.

If you can see them live, it makes their great CD "As Your Shoulder Turns On Your" all that much greater; trust me.