iPTV

2005-09-25

After spending a good portion of last night and this morning tweaking VLC options, I have successfully turned my iMac into an iPTV. The iMac really looks like it was meant to be turned into a TV, and I didn't want just another cable box feeding me shows that aren't really that exciting to watch. I decided that I would build an iPTV that would let me stream what I want, when I want. Through the magic of VLC, I can watch UDP streams from anywhere, full screen, on my iMac.
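
The whole thing boils down to a pair of VLC invocations, something along these lines (addresses and port here are just examples, not my actual setup):

    # on the box with the source material
    vlc video.mpg --sout '#standard{access=udp,mux=ts,dst=192.168.1.5:1234}'
    # on the iMac: listen on that port and go fullscreen
    vlc --fullscreen udp://@:1234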

Classes, Music, College Life

2005-09-12

The last time I posted here was on move-in day. It seems like so long ago. What's happened since then? I found a shop called Computer Renaissance that sells used Apple hardware. They sold me a Blue and White G3 tower and a Bondi Blue iMac on my birthday, definitely worth my time. I've got Debian running on both of them now and am working on getting Apache/PHP/MySQL set up correctly on the G3.

Classes have been pretty good. I placed into Computer Science 2, which equates to Advanced Java Programming. I think it's exactly where I should be. Some of it is review for me, but enough of it is new to keep me interested. The professor, Paul Tymann, also happens to be the author of the textbook for the class, so he knows what he's talking about. The labs here are nice, and the Sun UltraSPARC III workstations make a great platform for Java programming.

I have media on my mind these days: delivery systems, production, content forms, etc. I'm starting to realize that it doesn't matter what form the content comes in; as long as it's something people want to learn, see, or do, there will be a loyal audience. Take podcasts, for example... The TWiT podcast is unofficially the #1 podcast available. Why? They have content that people want: conversations about technology that aren't just how-to sessions. The audience doesn't care whether it's audio, video, text, or a holodeck, because the content is worth their time.

So, I can get just about any kind of content I want, in just about any form, in a matter of minutes, whether it's a podcast on my iPod or a fan fiction ebook on my PocketPC or laptop. The delivery form has been separated from the content, allowing the average person access to a broader variety of content than ever possible before. It even allows the average person to create their own content to share with the world, at their discretion.

Not quite sure where I was going with that, but you see my point. Well, I'm off to aid in an act of anonymous piracy (don't worry, I'm not the motivating factor, just a technical assistant without any awareness of the legitimacy of the content).

Getting cozy

2005-09-12

This site has moved to the G3 tower sitting under my bed in the dorm. It's got a better connection, more RAM, and it's configured a bit more cleanly. DNS has been updated and everything should be switched over in a matter of hours.

Perl, IRC, RSS, and other assorted letters

2005-08-27

As you may have noticed, somebody posted a request to create an IRC channel for digg the other day. I checked it out on freenode and found that someone had made an extremely annoying bot to read the RSS feed. It would flood the channel once every hour with a copy of the current feed. Not cool.

I decided that I wanted to write a bot that allows users to search an archived copy of the feed dating back a few weeks. I had two reasons for this: the search on digg.com is terrible, and in a usability context, most people will only use such a thing on IRC when they're having a conversation about an old article and want a link without having to switch to a browser.

It was around two in the morning, so I wasn't up to writing an efficient bot, just a working one. I decided that Perl would be the right tool for the job. After a few hours of hacking away at the Net::IRC library, I had something resembling a working script.

The bot is divided into two scripts, slurp and shovel. Slurp reads the RSS feed and throws any new articles into a MySQL database. Shovel is the actual IRC bot that just sits in a channel and waits for somebody to type !slurp <phrase>. Once every hour, on the hour, a cron job fires off slurp and retrieves the feeds from digg, Slashdot, and Wired. I wrote slurp in a fairly modular fashion, so adding another feed is just a matter of changing the command line arguments.
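
To give a feel for the shovel side, here's a stripped-down sketch of its command handling with Net::IRC and DBI. The table layout, credentials, and server details below are placeholders, not the real code:

    #!/usr/bin/perl
    # Sketch of shovel's core: sit in a channel, answer !slurp <phrase>
    # by searching the database that slurp fills. Schema is assumed.
    use strict;
    use warnings;
    use Net::IRC;
    use DBI;

    my $dbh = DBI->connect('DBI:mysql:database=slurp', 'shovel', 'secret',
                           { RaiseError => 1 });

    my $irc  = Net::IRC->new;
    my $conn = $irc->newconn(Server => 'irc.freenode.net',
                             Nick   => 'shovelbot');

    # 376 is the end-of-MOTD numeric; it's safe to join channels then
    $conn->add_global_handler(376, sub { $_[0]->join('##digg') });

    $conn->add_handler('public', sub {
        my ($self, $event) = @_;
        my ($msg)  = $event->args;
        my ($chan) = $event->to;
        return unless $msg =~ /^!slurp\s+(.+)/i;

        my $sth = $dbh->prepare(
            'SELECT title, url FROM articles WHERE title LIKE ? LIMIT 3');
        $sth->execute('%' . $1 . '%');
        while (my ($title, $url) = $sth->fetchrow_array) {
            $self->privmsg($chan, "$title - $url");
        }
    });

    $irc->start;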

There is currently no system in place to remove old entries from the database. I figure that I'll just see how long it lasts before I need to start expiring stuff.

I am currently running shovelbot in the ##digg channel on freenode and in #digg on EFnet. I'm not sure how long shovelbot will last on EFnet, because I've been k-lined for 72 hours without explanation. I don't really care, because tomorrow I'm moving to RIT, where I'll have a nice new class B IP for every one of my boxes.

Anyway, I'll be posting a writeup of Shovel and Slurp in the code section as soon as I get some free time. With orientation coming up this week, I'm not sure how much free time I'll have. Perhaps I'll hang out at Java Wally's for a while and work on it.

Peace.

iPod shuffle second opinion, webcam

2005-07-04

Ok, I know I was overly harsh about the iPod shuffle in the last entry, but after living with it for a few days, I've come to realize that it's an awesome player if used as intended. By used as intended, I mean stop screwing around with podcasts and strange sync ideas and just use it for playing random music. The autofill works great, albeit a little slowly. The only problem with autofill is that you end up with an extremely strange assortment of music. I solved this by creating playlists with different types of music on them. When I'm in a good mood, I load up the Good Mood playlist with autofill. It still ends up fairly random, but now all of the music feels like it "goes together."

So I was just ripping some legally owned CDs in iTunes and had an idea for a little time-lapse video of my mousepad. I tried to find suitable software for the Mac mini for about 20 minutes before I just plugged the camera into my Windows box and used the Timershot PowerToy from Microsoft... quick and dirty. Getting the images onto my website wasn't as trivial as it sounds, though. My options were fairly limited because the PowerToy doesn't support anything other than Network Places and simple files. I avoid FTP as much as I can, so that was out of the question. I already have Samba shares set up between my Windows box and the media server, so I took that route.

Every thirty minutes, the Webcam Timershot program snaps a fresh JPEG and saves it to a mapped network share on the media server. The media server has a cron job at 30-minute intervals to scp that file to the web server. Not amazingly difficult, but smarter than your average file transfer. The result is a view of my desk for the world to see. Sorry about the quality; I'm not going to fix my lighting just for a webcam. I like low lights.
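
The cron side of it is a one-liner on the media server, something like this (user, paths, and hostname are made up for illustration; scp from cron relies on passwordless SSH keys being in place):

    */30 * * * * scp /shares/webcam/desk.jpg www@webserver:/var/www/cam/desk.jpg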

As I'm writing this, I've decided to switch to five minute intervals. Thirty is a bit long, allowing for almost an hour of lag between shutter and cron job.

iPod Shuffle and Podcasts

2005-07-01

The UPS guy finally showed up with my iPod Shuffle... I've been waiting all morning! While I wait for it to finish charging, I'd like to share a few observations I've made.

First of all, the design is elegant and functional in traditional Apple style: a nice small package with all sorts of little details, like a battery status light and laser-etched letters that say 1GB on the USB connector. Very classy. The documentation is a little lacking as to the install process... No mention is made that if you already have the latest versions of iTunes and the iPod Updater from Software Update, you don't need the CD. A small annoyance, but a good ten minutes wasted.

I have only one legitimate complaint about the iPod Shuffle: automatic transfer of podcasts isn't supported on the Shuffle models. I was extremely disappointed to learn this when I connected the iPod. There was no mention of it on Apple's website, and I had to dig through the help files to find the smallest sidenote, which simply stated that you can only transfer podcasts manually, with no further explanation as to why. I have already filled out a feedback form on Apple's website about this. I just don't understand why Apple wouldn't allow users to do this when the code is already there. Furthermore, why wouldn't they at least mention it on the website? If I had known that the iPod Shuffle doesn't support podcasts, I would surely have upgraded to the iPod mini, or even the full iPod. I still haven't decided whether I'm going to return it, pending a response from Apple. I really hope they decide to fix this. I paid a premium for excellent hardware; shouldn't that be accompanied by excellent software?

Overall, my impressions of the iPod Shuffle are good. The design is elegant, it integrates well into OS X and iTunes, and it just works without a hassle (unlike Creative's players that require at least an hour of driver and software installations).

Interrupted Existence, Distributed Class, Clustered Neural Network, Podcasts

2005-06-28

As the devout readers in my audience know... Never mind, there aren't any. I've been without internet access for the last three days. It sucks. They're sending a tech out to take a look at my connection tomorrow; until then, there are no guarantees that anything will stay live. I didn't realize how much I rely on the internet until it was taken away. I've gotten to the point where I feel severely deprived if I am left without instant access to virtually all human knowledge (and porn) during every waking minute.

I got to the Distributed lab to find that we had a bunch of new toys, including some Myrinet and Firewire hardware. It took us about an hour to get everything installed and running because we decided to plug the extra hard drives into all of the systems, as well as double the RAM in half of them with sticks out of the unused systems. We didn't get around to installing drivers for the Myrinet stuff yet; I imagine we'll get to that tomorrow. We did, however, get Ethernet over Firewire working. Very geeky. Unfortunately, it didn't even come close to the expected 400 Mbps. At peak, we were hitting 13.4 MB/s (107.2 Mbps) over the Firewire link. I'm still not quite sure why we're not getting full speed. Until we get that figured out, we've still got a nice Gigabit Ethernet backbone to sit on, with Myrinet coming soon. Once the system is fine-tuned and everything is trunked, we should be able to get around 2.6 Gbps. Not bad at all for a system scraped together from borrowed equipment.
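
For reference, bringing up the Firewire link on Linux is mostly a matter of loading the IP-over-1394 module and then treating the port like any other NIC (interface name and addresses here are examples):

    modprobe eth1394                                 # IP-over-1394 driver
    ifconfig eth1 10.0.1.5 netmask 255.255.255.0 up  # assign it an address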

Now that MPI is running on both clusters, we need to get some software written. I've been working on a neural network that I'm planning to extend to support MPI very soon. But I still don't understand a few of the key concepts needed to get an artificial neural network running. The biggest piece I'm missing is the learning algorithm. I have a few rough ideas for building it, but I'm just not quite sure how to approach it. Perhaps I should try to find an artificial intelligence professor to explain it to me. Or a good book. Either way, it'll be really cool once I get it working. From what I've seen, the biggest drawback to existing neural network implementations is a lack of processing power. I figure that a four-node 64-bit cluster should have plenty of power for my purposes. Not to mention the six Alphas waiting to be clustered. I would imagine that the Alpha systems will deal with the floating point math better, but we will see.
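
From the reading I've done so far, the simplest candidate seems to be the perceptron delta rule. Just to illustrate the idea (this is a toy, not the cluster code), a single neuron learning an AND gate looks something like this:

    #!/usr/bin/perl
    # Toy single-neuron delta rule: train a neuron to act as an AND gate.
    use strict;
    use warnings;

    my @w    = (0, 0);   # weights
    my $bias = 0;
    my $eta  = 0.1;      # learning rate

    # training set: [input0, input1, expected output] for logical AND
    my @examples = ([0,0,0], [0,1,0], [1,0,0], [1,1,1]);

    for my $epoch (1 .. 25) {
        for my $ex (@examples) {
            my ($x0, $x1, $target) = @$ex;
            my $out = ($w[0]*$x0 + $w[1]*$x1 + $bias) > 0 ? 1 : 0;  # step activation
            my $err = $target - $out;
            # the delta rule: nudge each weight in proportion to its input
            $w[0] += $eta * $err * $x0;
            $w[1] += $eta * $err * $x1;
            $bias += $eta * $err;
        }
    }
    printf "w = (%.2f, %.2f), bias = %.2f\n", @w, $bias;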

Recently, I've started listening to the This Week in Tech podcast from the old TechTV guys. It's the first podcast I've actually gotten into, to the point where I'm listening to every episode and am disappointed when it's not out on time. But I'm forgiving; I understand that Leo has better things to do than edit Skype conversations all day. I just enjoy listening to the show when I can. It's a nice little break from reality.

All of this podcast stuff got me thinking about how kludgy my current way of getting podcasts onto my Creative Zen Touch is. I'm downloading each MP3 and copying it to the player manually. Not exactly a model of efficiency. So I downloaded a copy of libnjb and tried to build it on my Mac Mini. That was a joke. The library compiled fine, but the sample apps get undefined reference errors. I have a feeling that I'm missing a library path somewhere, but I've recently become aware of how Linux and BSD differ in the area of library management and am still learning about the BSD side of things.

I have to say that I'm starting to prefer BSD. I can understand why Apple decided to use it as a basis for their OS; it just works. Tasks that have cost me hours upon hours of fiddling with various Linux distros, like getting X11 running, are much easier with the BSDs, for a few reasons. First of all, BSD is very well documented. The man pages tend to be orders of magnitude more comprehensive than those of the Linux distros I've used. Almost all of the problems I've had with BSD systems have been with oddball software packages not cooperating with the system. The problem is that a lot of software is targeted at Linux (usually Debian or Red Hat), and the developers just assume that BSD will work fairly painlessly. In reality, BSD is more restrictive in some areas to keep code clean. For this reason, messy code doesn't play nicely. There are many, many other differences between Linux and BSD, and this blog just isn't the place to discuss them (though I'm open to any conversations you want to have in my underused comments section).

Anyway, where was I? Oh yeah, the podcasts. I found a nice little Perl script that will pull down new podcasts from an RSS feed and save them to a file. I figure that I could easily integrate this into a script that throws the files onto my MP3 player when I plug it in. One of the sample programs that comes with libnjb is called sendtr, meaning "send track" from my understanding. It takes a file path as an argument (maybe even stdin) and prompts the user for the tags for the file in the player's database. I figure that I could move the tagging stuff into arguments as well and have it work hand-in-hand with the Perl script to load up the player, without having to resort to ID3 hell. Unfortunately, I couldn't get libnjb and the Perl script working at the same time on either the Mac Mini or my new experimental box running OpenBSD. I'll get it working eventually; it's just going to take some serious hacking around.

Eventually, I'd like to have a script running in the background, watching dmesg for the player to be connected, and updating it as soon as a connection can be established. The podcast reader would run in a separate process every six hours. Six is frequent enough for me because I doubt I'll plug in the player more often than that anyway. Future features for this setup might include the ability to delete old podcasts from the player for me, although I don't really see that being a problem; I tend to delete them through the player's interface as soon as I finish listening. It makes it easier to find new content the next time I go to listen.
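
The rough shape of the glue script I have in mind is below. The feed URL, paths, and the bare sendtr call are placeholders; the real thing needs the tagging arguments mentioned above:

    #!/usr/bin/perl
    # Sketch: scrape enclosure URLs from a feed, fetch new files,
    # and hand each one to libnjb's sendtr.
    use strict;
    use warnings;
    use LWP::Simple;

    my $feed = 'http://example.com/podcast.rss';   # placeholder feed
    my $dir  = "$ENV{HOME}/podcasts";

    my $xml = get($feed) or die "couldn't fetch feed\n";

    # quick-and-dirty enclosure scrape; a real version should use XML::RSS
    while ($xml =~ /<enclosure[^>]+url="([^"]+\.mp3)"/g) {
        my $url    = $1;
        my ($file) = $url =~ m{([^/]+)$};
        next if -e "$dir/$file";                   # already fetched
        is_success(getstore($url, "$dir/$file")) or next;
        system('sendtr', "$dir/$file");            # hand it to the player
    }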

Well, I think I've done more than my share of typing for one day. Peace.

A week in recap, Three hour cluster, Graduation

2005-06-26

Well, the last couple of weeks have gone by so fast I'm still not quite sure how I did it, but I've graduated High School. I haven't been updating this blog as much as I'd like to so here's a recap of the last few days...

Monday Morning

  • Got a ride with Jeff (a friend of my Mom's) to East High School to take the Spanish Proficiency exam. I got there at 7:50 and found out that it didn't begin until 9:00. So I sat in the cafeteria and watched the East High students. I have been removed from the traditional school situation long enough that I can now act as a (mostly) impartial observer. I found it refreshing to see just how little things have changed. The cliques still exist, though the strongest separations appeared between males and females... I got the impression that the girls alienated themselves, not wanting to interact with their peers. Hmm. Not surprisingly, I found that the teachers still did more harm than good, interfering with a couple of boys tossing a ball around and leaving the violent/abusive ones alone. As 9:30 rolled around, I began the exam... It took them thirty minutes to figure out that eight people were missing. I wonder how long it would have taken with student cooperation... I finished the test around 10:15, just a few minutes past the required minimum attendance time. I knew I had done well on the multiple-choice section and had completely bullshitted my way through the essay. But that was out of my hands; I simply did not have the knowledge required to write an essay in Spanish.

Monday Afternoon

  • I took the bus to school and finished my remaining math tests in three hours. I passed the last math test with an 81 (minimum competency is 80). I bid everyone farewell until Thursday, stopped in to chat with DJ for a few minutes, and went home.

Monday Evening

  • At 5:00, I got a ride to the Business Development Center for my Distributed Systems class. I fiddled with Gentoo a little more, only to find that LAM-MPI has a few quirks when it comes to AMD64. DJ also brought his TViX system, and we watched Stargate Atlantis while we worked. Lots of fun. Later that night, I did some digging around on Gentoo's Bugzilla and found that I wasn't the only person having problems with LAM-MPI on AMD64. I added my report of the bug as a comment and left it at that.

Tuesday

  • I slept in; it was nice. I took the spare system that Joe loaned me, added a couple of extra 3-4 GB hard drives I had lying around so that I could use it as a "testing" system, and moved it to a nice cozy spot under my desk, previously occupied by dust bunnies. I installed OpenBSD 3.7 on it and christened the newfound system Calypso under my network's naming scheme. After a few hours of lounging around and watching DS9, I went to Distributed class again. When I got there, we started talking about what our goals for the next few weeks should be, and I raised the possibility of building another cluster, seeing as how the first one was almost completely working. The first cluster was built almost completely by Matt and me and didn't really give anybody else a chance to try their hand at the magic known as software. So Joel and Chuck went off with Joe and started building their cluster out of three of the P3 systems that were lying dormant on the far side of the room. When Matt showed up, I brought him up to date on the LAM-MPI problem and he suggested building it straight from source, without Portage. I had pondered the idea myself a few times but didn't have the necessary knowledge of LAM's workings to set the configure options correctly (what is ROMIO, anyway?). So Matt got that compile going, and as we were waiting for it to finish, I decided that I wanted to build another cluster too... It looked like everybody else was having too much fun without me. I hooked up four more of the AMD64 boxes and started installing OpenBSD on the first one. We only have a four-port KVM, so we decided to set up just one system at a time. The setup went surprisingly quickly once the packages were downloaded. I figured that this bottleneck could be eliminated by setting up a local download mirror. As the first OpenBSD system, henceforth known as node05, downloaded the required packages, I started the install on node06 and went upstairs to the gas station next door for some Twizzlers and a bottle of water. I also watched Matt put my recommendation letter for RIT in the mailbox. When I got back, both systems had finished their downloads. I finished the install on node06 and started tweaking it to match node05... I'm terrible about changing things before I've got everything set up. I configured Apache on node05 so that the node07 and node08 installs would be much faster. Indeed they were. Both of the remaining nodes took only 20 or so minutes to set up, making the total setup time for the cluster under three hours. I continued tweaking all of the systems to get them to match fairly closely and began adding a few necessities: NFS, Bash, passwordless SSH (the usual recipe is sketched below), and LAM-MPI. By the time I got to the LAM-MPI install, Joel and Chuck had the second cluster up and running with Debian Linux. They were having a little bit of trouble getting NFS working but eventually managed to get it to the point where the systems were fairly dependent on each other. 9:00 rolled around and I went home. A good day.
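
For reference, the passwordless SSH recipe is just the standard OpenSSH key dance, run as the cluster user on each node (default paths assumed):

    ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    # then copy authorized_keys out to the same path on every other node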

Wednesday

  • I woke up with but one thing on my mind: LAM-MPI. I had forgotten that we had started the build of it on node01 for the Gentoo cluster. I grabbed my laptop and SSHed into node01 to find that the build had gone flawlessly. After a little poking around looking at the results of the configure, I decided that it would be safe to install it on top of the Portage install. I waited a few minutes for the install to finish and tested it out. Still no luck, but it appeared that the old libraries were still in use. I uninstalled the Portage version without any more success. Then I remembered that there was a way to set the library paths. I stumbled around the command line for a few minutes until I remembered that /etc/ld.so.conf controls the runtime library search path. I set that up to use our /usr/global path (sketched below) and found that it worked! Apparently a clean 64-bit library was exactly what was needed. I logged out feeling good about the success and lounged around, watching a few more episodes of DS9. Once I was tired of that, I started working on my neural network program some more and got it to the point where it ran without any problems. I'm still lacking a learning function, but it's well on the way to being a productive program. Eventually I'd like to make the neural network an MPI app so that I can put some really powerful hardware behind it.
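
The fix amounted to two commands; the exact lib subdirectory under /usr/global is from memory:

    echo '/usr/global/lib' >> /etc/ld.so.conf   # add the directory to the search list
    ldconfig                                    # rebuild the runtime linker cache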

Thursday Morning

  • Nick showed up at my house at 8:10 to get a ride to the graduate breakfast picnic thing with Dave. Dave showed up around 8:15 with a trailer of kayaks in tow. We got to a nice little spot on the side of the river somewhere around Lindley... I tend to lose track of all of those small towns. We hung out skipping rocks across the water for about 45 minutes until Kristy, Nate, and Brandon showed up. I ate a donut and helped unload the kayaks. I said to Dave, "You're going to get me in one of these today, aren't you?" Without really protesting, I got in the water and loved it. I hadn't been that close to water (excluding showers) in at least three years.

Thursday Afternoon

  • I got to school and hung out with Jessy for an hour and a half until things started to get going with Kristy's POL and the graduation setup. I went over to the cafeteria and set up the PowerPoint of pictures from the school's activities. From there on out, it was just a bunch of talking and shaking hands. I don't really like goodbyes, but this one wasn't that bad, because I didn't really feel like I was saying goodbye, just see ya later. I went to dinner at Olive Garden with my family and then headed off to bed. A lack of sleep will do that to ya.

Well... Those are the most interesting parts of the week. I'll try to keep this up to date more often. Peace.

Slashdot's last breath, Voices, Kevin Rose, RIT

2005-05-24

Over the past few months, we've seen a major change in the way we view the web. RSS has given everybody the ability to instantly reproduce live content from a myriad of sources. Google News has aggregated the remaining sites that don't offer such services yet, or offer them in such limited capacity that they're practically unusable. User-driven sites such as digg are a welcome change from the strictly moderated forum known as Slashdot. I'm not proposing that Slashdot should disappear completely; there is still a place for a site that can guarantee a certain level of quality in its articles, whereas digg suffers from the possibility that an abusive user could raise the rating of an article to suit their own motives.

Podcasts are also causing a stir in the online news reporting world. Marrying the concepts of talk radio and file sharing, podcasts are moving users away from the web browser and into alternative mediums for delivering what is essentially the same content. The difference is that the number of podcasts is still relatively low compared to news sites, so they are subject to a bit more opinion and control by their authors, which isn't necessarily a bad thing.

That of course leads me into the largest change in journalism as of late: blogging. Blogging gives everybody a voice, no matter how different their ideals, ethics, values, culture, country, government, friends, family, co-workers, religion, or other affiliations. Everybody has a voice. It is ironic that the internet is finally being used for what it was designed for in the first place: the transfer of facts, opinions, and supporting data between different people. Comments give everybody a chance to voice an opinion on somebody else's opinion. Even if you don't have a blog, you still have a voice. I believe that this is the mentality that will drive new social and communications networks in the coming years, allowing everybody to say anything they want without fear of censorship or reprisal (unless you live in China, in which case you have my condolences).

For those that have been living under a rock without an internet connection (those rocks don't pay for services they don't need; they're solid people), you may not have heard: Kevin Rose, everybody's favorite script kiddie, has quit his job at G4 to become a freeloader (er, independent creative force). He released the first episode of his new show Systm after a few denial of service attacks and a flood against the IRC server (script kiddie jealousy, I assume). A comment on Slashdot said "The system is down," taking a cue from our beloved Strong Bad and The Cheat. Overall, the show was pretty damn good for an internet show, though still a little immature as to the content. In a nutshell, they built a box that uses the receiver from an X10 wireless camera and continuously modulates the "channel" switch between the four settings. Not an amazing feat, but I have a feeling that Kevin is headed in a better direction with this one as opposed to his seemingly abandoned internet show, The Broken.

I love the idea of internet TV shows. There's just a sense of purity to them; they're untouched by corporate politics and the constant advertising that encapsulates our lives. Unfortunately, I don't think things will stay this way for long. At some point, people like Kevin Rose have to get paid, and in my experience, money doesn't just show up in your bank account no matter how much good you're doing for society (or for CS students on summer break). Eventually, a company like G4 will come along and ask to air his shows on their channel. From there, it leads to G4 saying, "We'll give you an extra 20k if you mention Pepsi." Pretty soon, we're right back where we started: a weekly sponsored show under the iron fist of Soviet Russia G4.

Aside from my little essay on the future of social content on the internet, I haven't been thinking about much. I took another trip up to RIT last week to meet with some professors and get a more detailed perspective on what they have to offer. I talked to a professor in the Computer Engineering department who really knew what he was talking about, which is exactly what I expected from the former head of the department. Looking at the labs and the different projects they were working on helped me decide that I would rather spend my time on the software side of things. I enjoy hardware work, and it fascinates me to see how everything is put together at that level, but I just have trouble seeing the depth of the system by looking at logic gates and instruction sets. In my mind, languages like C are dominant. I can visualize just about anything in C, from the interface on down. Granted, this may be because I haven't taken any of the Computer Engineering classes yet, but I prefer to work in an instant-gratification environment. When I'm programming, I can run my code at any point in time and know whether it works; the same is not true of hardware. There is no magic "stub" code you can stick on a board to see the results of the piece you're working on. So, after a rather short conference with a counselor in the Computer Science department, I decided that I will be going for a CS major.

At first glance, RIT's Computer Science curriculum looks super easy and simple. On paper, it looks like a factory program. Once the structure of the requirements was explained to me, though, I found that there are enough elective credits available that the program really covers just about everything. It's only written with that structure to appease the bureaucratic types; it actually encompasses any specialty you would want to play with, to the point where it's even possible to get a minor in another CS-related field without taking any more classes than the next guy. This type of flexibility is what is drawing me toward RIT's CS program.

On one final note, I'd like to request that anybody reading this please leave a comment on what you'd like to see here. Would you like something changed? Or added? Do you want a new feature for end users? Do you want me to get off my ass and fix the RSS feed? Let me know.

Xbox 360, PS3, and IBM's salvation

2005-05-17

For those of you that are completely oblivious to the gaming and technology worlds, I'll shed some light on the more recent developments at E3 (Electronic Entertainment Expo). The biggest announcements to come so far have been the Xbox 360 and Playstation 3. Not only are these extremely powerful systems, they're also going to change the way we think about home entertainment.

The Xbox 360 was unveiled last week on MTV in a somewhat disappointing light, with today's pop idols drooling over it. Since then, we've managed to get an ever-increasing amount of detail as to what the system contains. Here are the basic specs: a triple-core PowerPC processor at 3.2 GHz and a custom ATI graphics chip, for a total system performance of 1 TeraFLOP. Impressive, to say the least. On the software side, the 360 runs Windows Media Center Extender under the hood. All of the functionality of Windows Media Center Edition will be available, provided that you have another Media Center box on the same network to host the data. Xbox Live, Microsoft's online gaming service, will be free with every Xbox 360. A paid subscription to the service will yield a "Gold" level account with unnamed higher-level features.

Sony's Playstation 3 debuted on Monday at E3 in an extremely Japanese fashion: elegant graphics, music, and light shows, with a narrative by the owners of the company rather than the people who built the product, as opposed to Microsoft's "hip and young" approach. Regardless of the presentation, the PS3 is also PowerPC-based at 3.2 GHz, using what they are calling a Cell processor. From my understanding, a Cell processor is basically a multiple-core processor with the ability to do IPC (Inter-Process Communication) on chip. From a computer architecture standpoint, this should be orders of magnitude faster. This is reflected in the initial specs released by Sony, which state that the PS3 will be able to push 2 TeraFLOPs, doubling the performance of the Xbox 360.

As we have seen in the past, though, big numbers on a spec sheet can be misleading. While the PS3 does have more power overall, 1.8 of those TeraFLOPs are in the graphics chip, giving programmers very little flexibility to use the power for anything other than pushing graphics. The Xbox 360 puts most of its processing power in the system processor, where it can be used in a truly multipurpose fashion. Time will tell if this really makes a difference or not.

From a graphics standpoint, both systems are going to be stunning. Both support full HDTV output in one form or another. The PS3 excels in this area with its ability to drive two channels of HD output at 1080p, whereas the Xbox 360 only supports one channel at a maximum of 1080i. For the uninitiated, this means that the PS3 can handle two displays at a theoretically higher resolution than the Xbox 360. In actuality, the number of people in the world with an HDTV is still fairly small, and having two HDTVs is almost unheard of at this point. Again, time will tell if it really makes a difference. In my opinion, without the CPU power to create those graphics, the PS3 will fall behind, with nothing to drive all of those overperforming output chips.

The biggest development to come from all of this in my mind is the fact that both systems are running PowerPC based architectures (with rumors of Nintendo's Revolution doing the same). There is also some fairly strong evidence to suggest that Microsoft and Apple are teaming up once and for all. I make the following contentions:

  • A PowerPC Xbox 360 means that Microsoft has ported Windows in some form to this architecture
  • Xbox 360 development kits are Apple G5 desktops
  • During the webcast of the Xbox 360 press release at E3, the stream cuts out for about five minutes with a message about "proprietary information"; afterwards, J Allard mentions Bill and Steve in the past tense, leading me to believe that they appeared on screen (or even on stage) together
  • Recent leaks state that the iPod will be able to sync/dock with the Xbox 360; this was confirmed during the webcast by an extremely obvious hint about "the ability to connect other vendors' players"

From these facts, I feel it is safe to assume that something is brewing behind closed doors in Seattle and Cupertino. I'm sure that IBM is enjoying all of this, seeing the architecture that they stumbled into becoming dominant just about everywhere. I have to make a point of saying that I am definitely a proponent of the PowerPC architecture because it is RISC-based and, from a design standpoint, is just better for many applications today.

Amidst all of these releases and all the platform confusion, the former king of the gaming industry, Nintendo, is nowhere to be found. Some very obscure and small leaks regarding the dimensions and processor of the Nintendo Revolution have made their way to the surface, but very little else. Either they're staying extremely tight-lipped about their new console, or they're having an "oh shit, what do we do now!" moment.

Update: 5/17/2005 6:00 AM

  • Nintendo has a countdown on their website that will reach zero at 2:00 PM EST. A press release also available on their site claims a release party on Tuesday. I'll update this page as I know more.

Tasty Fruit, Art in another dimension, and lots of computers

2005-05-07

As some of you may be aware, I made the switch a few months ago with the Mac Mini. I simply love it. Everything is easier and just makes sense. But since my initial reactions to the Mac have faded, I've moved back toward using my laptop almost exclusively. You just can't beat the portability factor. So last week I installed Gentoo on the Mini and have been using it as my "Home away from home" system. In a nutshell, this means that I've been sshing into it to do anything more than web browsing.

So my use of the Mac was hindered for no reason other than the fact that it's not as portable as I'd like. Seeing as how I don't have the money to go out and buy another system on a whim, I've decided that this sort of setup will have to do until my laptop either dies or becomes completely outdated. I figure that either way, I'll be needing a new system in three to five years, depending on how rapidly the technology changes (Moore has admitted that his "law" was never a law, only an observation).

You may be reading this and wondering "Ok, why the hell did he mention this?" Honestly, I'm using the Mac again. The simplicity of OS X is too good to just leave alone. I can't just brush it aside because I need to sit at a desk to use it. Currently, I'm dual booting it with Gentoo so that I can still ssh into the box when I'm away and have a 100% GNU/Linux system. I'm planning to play with some of the hybrid package management systems like Fink and Gentoo's Portage on OS X in the next few days to see if I can find a way to work without having to dual boot.

I've been playing with a few new things lately, one of which is 3D Studio Max. I'm having a lot of fun modelling and texturing objects. I'm not quite sure why it's so much fun; it just is. An offshoot of that interest has been distributed rendering. Most of the tools/programs/scripts for distributed rendering were designed for homogeneous render farms of high-end systems with node-based licensing schemes. There are many problems inherent in this for a student like myself. The most difficult one to overcome is the fact that no two systems on my network are the same, and the network is split about 50/50 between Linux and Windows. At the moment, I've gone back to rendering on a single system because it seems to be faster than any of the networked solutions I've tried.

Distributed computing alone has become a primary interest of mine. I love the idea of using lots of low-powered systems to build an extremely high-powered one. In pursuit of this interest, I'm learning to work with MPI in C. Using LAM/MPI on all of my Linux-based systems has yielded a large amount of computing power without adding any hardware at all. At the moment I'm running three nodes (four with my laptop) for testing and debugging a program called mpi_netperf. Its purpose is simple: measure the latency of all connections in the MPI cluster. This means that every node establishes a connection to every other node, sends some data, and figures out how long it took. Each node then reports back to the master node with the times. Once I've got stable code running on my makeshift cluster, I can send it to Matt Haas and have it run on one of the many clusters at Geneseo's DSLab. I signed up for Matt's Distributed Systems class for the summer. He managed to get three more Alpha DS10T systems to build a cluster for us to play with.
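
For the curious, a LAM/MPI session is pleasantly simple once the hosts can reach each other; it boils down to something like this (hostfile name and process count are examples):

    lamboot -v lamhosts        # start the LAM daemons on every node in the hostfile
    mpirun -np 4 mpi_netperf   # launch four processes across the cluster
    lamhalt                    # tear the LAM universe back down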

The developers of Azureus have truly outdone themselves with the latest release. It includes a new tab under each torrent that presents a graphical representation of the torrent's network connections: each host sits in a circle around your computer, with lines connecting them all, and blocks can be seen moving up and down along the lines. It is a simple and beautiful visualization of quite a bit of complicated data. That's a major part of distributed computing: the ability to represent all that good data in an easy-to-understand manner. Gotta love computer science, no?

Geneseo, Buffalo, distcc, and Gentoo

2005-04-19

I haven't posted here in quite a while so as always, some things have changed. I no longer have a girlfriend. She broke up with me. I don't really want to talk about that.

Yesterday I had an opportunity to see the Distributed Computing lab at SUNY Geneseo and Buffalo University's Center for Computing Research. Pictures available.

The DSLab at Geneseo was really cool: two Xserve clusters, a video wall, and an SGI Onyx. I walked into the room and the admin (Kirk) said, "It's a little chilly in here. Let's turn on a cluster." The windows behind the video wall were open to keep the computers cool, but the machines weren't generating enough heat. So he fired up the cluster of G4s and started running Einstein@Home. For those that don't know, Einstein@Home is a project similar to SETI@Home, except it tries to grok gravitational waves instead. Geneseo's DSLab is on the top 10 list of most productive teams.

After seeing the DSLab and touring the Data Center at Geneseo, we drove to Buffalo University with a group of Geneseo students in order to take a tour of the Center for Computing Research (CCR). Buffalo's CCR is home to the 22nd most powerful computer on the planet. It's somewhat humbling to be in the presence of such a beast. I enjoyed the tour quite a bit although I was a little put off by the elitist mentality of some of the people there.

Buffalo University has an obsession with Grid Computing. For those that don't know, Grid Computing is like taking every cluster (or computer, for that matter) on the planet and building a giant cluster out of it. As you may already be aware, distributed computing alone is in its infancy, so it seems like the CCR people are trying to walk before they can crawl. Their methodology for building the Grid seemed a little rushed from what I could gather. They are trying to build it with as much off-the-shelf software as possible, leaving no room for true innovation. They simply have bigger computers to run the software on.

That being said, Matt Haas at CCC is going to be teaching a Distributed Systems class over the summer that I am already involved in as much as possible. It'll be awesome to build a cluster and be able to run my own code on it. I've already started writing software.

If you take a look at the pictures I linked earlier, you'll also notice a few from CCC's Tech Showcase last week. It was a nice display of what you can do with a computer, meant to get more students (and faculty) interested in the CS courses at CCC. One of the highlights was a four-node Knoppix cluster. I spent the day playing with that. We mostly showed off the things you can do with a clustered display system by building a video wall across the table. Rather amusing.

As far as Runemonkey news goes, I've decided that I will be standardizing all of my Linux servers on Gentoo. I'm doing this mostly for my sanity, but also because having a homogeneous software environment makes clustering that much easier. As I type this, I have three systems running a distcc build of Mozilla Firefox for my laptop. If you are familiar with my network configuration, the systems running the cluster are Apollo, Iris, and Athena. Iris is actually running Windows XP with Gentoo on VMware. This is only temporary until I get another hard drive installed to make dual booting a bit more feasible.
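
For anyone curious about the distcc setup, the moving parts on Gentoo boil down to roughly this (job count and host list are examples from my network):

    # /etc/make.conf
    FEATURES="distcc"
    MAKEOPTS="-j8"             # roughly 2x the total number of CPUs

    # tell distcc which boxes may take compile jobs
    distcc-config --set-hosts "apollo iris athena"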

Over the next few days, I will be rebuilding Artemis with Gentoo and saying goodbye to my beloved Petra Linux, which was based on Linux From Scratch. I will definitely miss knowing the system like the back of my hand, but I decided that it makes more sense to spend my time learning a more widespread distribution. The other major factor in this decision was that Portage will keep everything updated and secure without my having to run security audits every few weeks.

So, as a result of the rebuild, all Runemonkey hosted sites will be seeing slow or disrupted service in the coming days. You get what you pay for.

On a final note, I will make an attempt to post at least once a month, if not more often. I know it's been a while since the last update, and that's only because I thought nobody was reading this. But from the comments I've gotten from various people, there are a few of you still hanging around, and it seems like my readership is growing! w00t!