April 2008 Archives

April 30, 2008

Taking the Social Networking Plunge: Using Twitter

Lately I have been sucked into the Social Networking scene.  I think someone is trying to tell me something about individuality and the need to collaborate, because I have been creating social networking logins all over the place.  It started with my sister-in-law convincing my sister to sign up at GoodReads, and my sister subsequently sending it to me.  I didn't see any harm in it, as it was a place for me to post the books that I have read and am currently reading.  Not too invasive, right?  Next, my colleague introduced me to Last.FM, which is quickly taking the place of my regular iTunes material.  After all, where else can I get Abney Park mixed with Vernian Process and Dolphins?  It's now become an addiction, and I've only had it for one day.  The final straw was getting a Twitter account.  I've never seen the point in spending so much time writing about what I am doing.  Isn't that what a blog is for?  Well, yes and no.  A blog is great for longer thoughts, essays, comments, etc. that take a lot of thought and time to explain.  But what about those single thoughts, complaints about something not compiling, or little projects that just don't take a lot of time?  That is what Twitter allows you to do.  I'm now using Twitter to keep track of my productivity:  I can use it to document the projects I do during the day, so I can look back and see just what I accomplished while working.  So, believe it or not, I've actually taken the plunge into the Social Networking scene.  Perhaps soon, I may actually become a social person.  ^_^

April 28, 2008

New Theme!

Generally I don't change the design of a site, mostly because I'm not a big design person.  But today I found a really cool Steampunk theme that I just had to implement.  I really like it, and I hope you all do too.  ^_^ I think I'll stick with this one, since it's a pretty nice theme.  At least until I find one that is a bit more in line with the topics I cover.  Let me know what you think!

April 26, 2008

Autism or Asperger's?

This week we took my son in to the psychologist for another evaluation.  His intelligence was evaluated, as well as his motor skills, with a huge focus on his interaction.  The psychologist was a little distant, perhaps a little exhausted by the end of the day, but did a fantastic job.  What was interesting was his final conclusion:  he doesn't think that our son is particularly Autistic.  Probably based on the questioning look on my face, he mentioned that he thinks my son may in fact have Asperger's syndrome, which is similar to Autism but different on many levels.  This really took me by surprise, because I had been spending most of my time learning about Autism and had no knowledge of Asperger's syndrome.  Asperger's syndrome begins at roughly the same time as Autism, and is so similar in its onset that many children are misdiagnosed.  The main symptoms are delayed speech and an intense focus on specific items or subjects.  It's actually very similar to the overall "Robb" behavior:  that of very shy boys who are rather knowledgeable in their chosen fields.  But if it's a common Robb trait, it's always been at such a minor level that it was never an issue.  So how is this different from Autism?  Autism has a higher social impact, as well as a higher verbal impact.  Many Autistic children have their intelligence affected (with the exception of high-functioning Autism), while Asperger's children tend to have higher than average intelligence.  And finally, Asperger's children tend to have a better chance of becoming mainstreamed in society than Autistic children.  Now, the psychologist still said he will call it Autism for our son for now, because of one thing:  his almost total lack of interaction with people.  My son is very much in his own world, rarely interacting with other children or adults outside of a few select family members.  It actually takes some time before he "lets you in".  He just recently started to acknowledge my mother, which thrilled her to pieces.
This is not a common trait in Asperger's, but is very common for Autistic children.  The one thing that has me hoping for Asperger's is the chance that my son could start talking within a few months, and in full sentences.  Autistic children have a lesser chance of talking in general, while children with Asperger's syndrome have a faster catch-up time for speech and communication.  There is one more disorder that I haven't mentioned, which is PDD.  PDD, or Pervasive Developmental Disorder (of which both Autism and Asperger's are a part), is more of a "middle" road as far as speech is concerned.  Children tend to start speaking around age 7 to 9, and catch up quite quickly from there.  So, what does this all mean?  Well, it means that we continue to work with him.  The psychologist gave us a more targeted model for our son, so we can work with him more directly.  It involves parallel play that moves to interactive play, having him say "hello" and "good-bye" to everyone in the room, working on eye contact, and specifically working with common signs to get him communicating.  He will provide 30 specific things to work on that focus on these areas, which is perhaps the best thing we could have gotten.  So while I'm grateful for the help and support that Pre-School is giving him, nothing is more stressful than feeling powerless to help your child.  Now we know what to do, and why we need to do it.  We are now empowered to help our son, and can go at it with a will.  It's amazing what a targeted program can do.

April 24, 2008

Building in Second Life

Recently I purchased some land in Second Life, on the prestigious island of Caledon.  For those that are not in Second Life, Caledon is a Victorian/Steampunk island (or group of islands) with a great community.  Many residents are dedicated to education in all its forms, so getting help is quite easy.  I found a couple of homes that fit the part, and I'm looking forward to posting them.  Of course, there is a slight problem:  furniture.  While I intend to use my new home as a learning center, people need a virtual place to rest their virtual behinds and relax.  Well, I found many of the furniture pieces were pretty expensive (relatively).  Why?  Aren't they just scripted objects?  I found the answer to that when I tried to build my own couch.

<b>Building a Couch</b>

I knew the couch was going to be difficult, but I wanted more furniture than what I found for sale.  So, I thought I would build my own.  I started with five prims (or primitives, the building blocks of objects in Second Life), and made them the padded areas.  I then built a back and bottom using an additional 5 prims.  These I textured as walnut (because I liked the color), using a wood grain finish.  I then added 8 more prims for the feet.  Once done, I linked them all together into one object by selecting them all and hitting CTRL-L, and added physics to it (so that it will stay on the ground).  All in all it took me 3 hours to make, and the finished product looks like something that came from a High School woodworking project by a student that wasn't that keen on the project.  But it's my first attempt at building something, so I'm proud of it.  Once the house is complete, I'll post a link to it.  ^_^ So, now I know why people charge so much for their furniture.  Granted, once it's complete they never have to build it again, and can even make additional changes to it and create a whole new piece of furniture.
But as it took me 3 hours to make something that amateur, I would almost hate to see how long it would take to make something more polished.  Then again, time will tell, as I intend to keep up this project.

April 22, 2008

The Module Method: UNIX Style

Lately I have been feeling disappointed.  It seems that more companies are determined to keep their "proprietary" baggage with their products, instead of taking advantage of the open source projects out there.  Personally, I blame the Dot-Bomb period of the 90's, when everyone and their dog funded kids that had no business model but could boast a good URL and use of Linux.  Yes!  Linux!  I don't have to pay for it, therefore it has to be a good company!  Well, that ship has sailed, and the smart companies with business models came through the wash unscathed.  Many became brilliant Wall Street darlings, raking in profits like crazy.  These companies usually fall within two categories:
  1. Open Source (or Open Standard) companies that feed off of support or advertisement revenue
  2. Proprietary companies that feed off of sales and license fees, as well as support and advertisement revenue
Now, we all know this already:  Open Source is very powerful.  But why?  Why is it so powerful?  Is it because it's "free" as in "freedom"?  Is it because you have a dedicated community working on it constantly, improving it, patching it, and so on?  While both are compelling arguments, I would argue that they are mere by-products of the real strength of Open Source:  Open Standards and the UNIX model.  
Yes, because of open standards, a user anywhere can interface with a service in their chosen way and still utilize its functionality.  Think of all the instant messaging protocols out there, and which one is most versatile.  I've found that Jabber works best for me, and it's an open protocol.  I do still use AIM for those few contacts that choose to forego open source (or have devices that don't support it), but for most of my communication I use Jabber.
Jabber is an open protocol, standardized as XMPP.  As such, there are literally hundreds of applications and programs that can use Jabber.  And because of the Open Standard, it's possible to link servers together through federation, allowing users on one Jabber server to communicate with users on another.  But that's not all!  Jabber clients know which Jabber server to talk to based on the user name of the person you are chatting with.  Can AIM, MSN, or ICQ do that?  Nope, because they all use their own central servers.
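As a rough illustration of that last point (my own sketch, not taken from any particular client), an XMPP client only needs the domain part of a user's Jabber ID to know where to connect:  it builds a DNS SRV query name from the domain and resolves that.  The JIDs below are made up.

```python
# Sketch of how an XMPP/Jabber client finds the right server from a bare
# user name (JID).  A real client would resolve the SRV record via DNS;
# here we just construct the query name, per the XMPP convention.

def jid_parts(jid):
    """Split a JID like 'alice@example.org/home' into local, domain, resource."""
    local, _, rest = jid.partition("@")
    domain, _, resource = rest.partition("/")
    return local, domain, resource or None

def client_srv_name(jid):
    """The DNS SRV record name a client would resolve to locate the server."""
    _, domain, _ = jid_parts(jid)
    return "_xmpp-client._tcp." + domain

# Two users on different servers:  each client knows where to connect
# just from the JID, which is what makes federation possible.
print(client_srv_name("alice@example.org/home"))   # _xmpp-client._tcp.example.org
print(client_srv_name("bob@jabber.org"))           # _xmpp-client._tcp.jabber.org
```

The centralized networks skip this step entirely and hard-code their own servers, which is why they can't federate.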
So, at the end of this rant, it's that benefit of Open Standards that really shines for the Open Source community.  The good news is that more companies are seeing the benefits and moving that way.  I only hope that other companies will see the benefits of Open Standards, and move in that same direction.  After all, look at the benefits the Railroad industry saw when it finally settled on a standard rail gauge.
Just imagine:  an electric car you buy from Honda that could be beefed up with electric motors from a GM electric truck.  It takes interchangeable parts to a whole new level.  That's what I see as being a true benefit to the consumer.

April 18, 2008

Scripting in Second Life

I've never really gotten into the whole "social networking" thing, mostly because I'm not that sociable to begin with.  There are times when I go into some social environments, like World of Warcraft or Runescape, but that's mostly to check on the platform.  After a while (usually about a month or so), I get bored with it or turned off by some of the people in there.  I've known about Second Life for a while, too, but I never really had any thoughts about using it.  After all, what was the point?  And then I heard a report on NPR some time ago about an Ivy League school hosting some law classes in Second Life because of the lack of real classroom space.  That got the wheels turning for me...  Distance Learning.  Of course it's perfect for synchronous distance learning, because everyone is represented in the world by their Avatar.  Once there, they can interact either vocally or through chat, so they can all have their say.  Also, instructors are able to take a more direct classroom approach, because they can read everything everyone is saying.  So, I thought I would start checking it out.  It's a great platform and has a lot of potential, but there are some objects that are missing.  I'm not going to mention what they are, because I hope to have them scripted here soon.  But first I needed to learn how to script in Second Life.  The scripting process is pretty straightforward, as the language is similar to Java and C, but has its own special objects.  That means there shouldn't be that much of a learning curve, and I can probably recycle some code from one object to another.  It's actually pretty fun to work in the environment.  I'm hoping to have some fruits of my labors soon, and it will also answer a huge problem I have had with distance learning for technical classes.  Stay tuned!  ^_^
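For the curious, Second Life scripts are event-driven:  an object's script sits in a state and reacts to events like being rezzed or touched.  Rather than misquote the actual LSL syntax from memory, here is a rough Python sketch of that model; the class and method names are my own invention, with the LSL equivalents noted in comments.

```python
# Rough Python sketch (not real LSL) of the event-driven model Second
# Life scripts use:  an object reacts to events such as state_entry
# (fired when the script starts) or touch_start (an avatar clicks it).

class ScriptObject:
    def __init__(self):
        self.chat_log = []               # stands in for llSay() chat output
        self.on_event("state_entry")     # fired when the script starts running

    def say(self, message):
        self.chat_log.append(message)    # llSay(0, message) in real LSL

    def on_event(self, name, *args):
        """Dispatch an event to the matching handler, if one exists."""
        handler = getattr(self, name, None)
        if handler:
            handler(*args)

    # --- event handlers, mirroring LSL's default state ---
    def state_entry(self):
        self.say("Script running.")

    def touch_start(self, num_touches):
        self.say("Touched by %d avatar(s)." % num_touches)

obj = ScriptObject()                 # like rezzing the object in-world
obj.on_event("touch_start", 1)       # like an avatar clicking it
```

The real language adds its own object types (vectors, rotations, keys) and a large library of `ll*` built-in functions, but the handler-per-event shape is the same.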

April 16, 2008

The Move To Dreamhost

For quite a while I have been looking for a hosting service that will give me a large amount of flexibility in what I want to accomplish.  Namely, I wanted to find a service that would allow me to host my blog, set up a home page, link in a site for my book when I finally get it ready for publication, set up a learning management system for my classes, and let me start a collaboration project with my family on genealogy.  Most options out there were fairly expensive, or didn't provide the level of reliability that I was looking for.  What to do?  Well, I finally found Dreamhost.  Now, I'm not saying that Dreamhost is the end-all be-all, but it had a great offering for the price, with 500 GB of storage and tons of bandwidth, both of which continue to grow as the site grows.  I can also host as many domains as I wish, as many websites as I like, and even have ssh access to my server.  This means I can use sshfs to upload my site files, while also using my server for some much needed storage.  But that's all the reasons why I moved to Dreamhost.  Let's talk about the actual move.  The domain creation was all really easy.  Each subdomain is free, which is very convenient.  The Mail, Calendar, etc. are all hosted through Google Apps, which is convenient in that it saves space on my server.  If only I could get the GCALDaemon working properly...  Anyway, setting up the domains was very painless.  Next, the services.  I started, of course, with the blog.  They had WordPress available as a one-click install, which worked beautifully.  It was ready for me to set up and configure almost immediately.  I then selected the current template because it was very light on the visuals, and soft on the eyes.  The second service I set up was PhpGedView, which I set up on <a href="http://gedcom.robbclan.com">gedcom.robbclan.com</a>.  Then I tried to upload my family's GEDCOM file:  it failed.  Why did it fail?
The program was set up for files up to 7 MB, and my family's GEDCOM file is 65 MB.  It would have been a problem, but I just uploaded it with ssh and got it installed.  The only problem now is tweaking the program a little bit to allow updates to such a large file.  But that's something I will be working on later.  The last service I set up was my Moodle server.  I love Moodle:  it's an open source alternative to Blackboard or WebCT, and is arguably better in many aspects.  It set up quickly, and I chose a theme that was very telling (brushed metal).  Yes, I chose it because it's a popular Mac theme.  But as the majority of classes I teach are Mac classes, I thought it would be appropriate.  I then set up my classes, or I should say supplements to my classes.  They are mostly discussions and quizzes that I use with my for-credit students, which students in the non-credit courses can access and use if they so wish.  They are not meant to be a replacement for the classes, or an attempt to teach them online at all.  I then set up the outline for an online course on how to learn to cook.  It's currently closed, as it isn't nearly as ready as I would like, so don't expect to access it or use it.  Now, the only thing left to do is to create a real website.  Currently I have a placeholder with very little code, using the template from the blog.  I also properly cited the theme creator, as he deserves all the credit for the theme.  The website will be changing as I get more time to work on it, but for now it's enough to explain what the domain is all about, and it has links to the various programs I'm working on.  So, that's my adventure so far!  Right now I'm getting my work machine ready for a reinstall, and then I will be working more on the site.  I also need to get my blog published and out there.  I don't expect I will be getting as many hits as I did from my Blogger account, but it will be interesting from an SEO perspective to see how the hits change.
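Incidentally, a 7 MB upload ceiling is usually PHP's own limit rather than anything PhpGedView-specific.  On hosts that allow per-directory overrides, the usual fix is a few standard php.ini (or .htaccess) directives like the following; the exact values, and whether a given host honors them, are assumptions on my part:

```ini
; Raise PHP's upload ceiling enough for a ~65 MB GEDCOM file.
; post_max_size must be at least as large as upload_max_filesize,
; since the whole POST body has to fit under it.
upload_max_filesize = 70M
post_max_size = 72M

; Large imports also take time and memory, so raise those as well.
max_execution_time = 300
memory_limit = 128M
```

Even with these raised, uploading a file that size over ssh (as I did) is faster and more reliable than pushing it through a browser form.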

April 10, 2008

Distance Education Lab Potential: The Mobile Virtual Lab

Recently our department suffered a minor disaster that left our entire building without power for 36+ hours, and affected 45% of the University of Utah campus. Other departments were lucky enough to have this happen during Spring Break. For Continuing Ed, unfortunately, we were left with classes that had no power. As my department teaches tech classes, we were very much in crisis mode. We ended up moving computer hardware to a new location (from the Annex building on the main campus to the Murray center) in order to accommodate the class that was running (which was Final Cut Pro).

This got me thinking, though: what if it happened again, but this time with a class that was in a specific lab for a reason (such as Cisco or Linux)? Would we be able to do the same thing? Not really. Our Mac lab is entirely mobile, as it is made up of laptops (including the classroom server). But our PCs are Dell towers, and not very mobile.

Also, as has become a concern with our Linux offerings, many students want open lab time to work on exercises that there simply isn't time for during class. Our labs are not open labs for that very reason, and once you make changes to a machine in a physical class, you can't just sit down at another computer and pick up where you left off.

With these two issues in mind, I started looking at potential replacement options. The best candidates that I could find were various forms of virtualization, each with its own strengths and weaknesses. The four that I'm going to talk about here are MojoPac, Xen, VMWare, and LivePC. By the way, if I've misunderstood something about any of these deployments, please let me know!

What Do I Need?
What I need is a way to deploy a virtual environment to any platform in any location through a network or the internet, while still having full control over those virtual environments. This means controlling access to the virtual machines as they are deployed, and access to the software within the virtual machines once they are deployed. And all this needs to work over a common denominator, namely home internet speeds. Normally this would mean dial-up, but I think I would even be willing to push for DSL speeds. Also, it needs to be able to run without a network connection.

MojoPac
This wasn't the first solution that I checked out, but it is one of the most mobile. The idea behind MojoPac is to provide a software virtualization solution that allows the user to store software on a jump drive, plug it in, and work on any Windows machine. It's a great idea for Windows (they need something like this desperately); both Linux and Mac already have applications that can run from anywhere, thanks to their UNIX resource model (i.e., no registry).

This solution is perfect for a disaster scenario, where entire lab software deployments can be quickly and easily moved into a new lab without moving physical machines. But what about taking it home? Now you have licensing issues, because it's highly likely that someone will want a "free" copy of the software for their home or business use. That creates a major legal issue, because security becomes a concern. Of course, a simple solution would be to include the cost of the software in the tuition, which students already complain is too high. So, I started to look elsewhere.

Xen
I have seen few virtualization utilities that are as versatile as Xen. From imaging to multiple virtual servers running on the same machine, it's no surprise that they were purchased by Citrix. Here you can create virtual environments akin to VMWare, or create virtual desktops that are served off of a central server. Both work well within a network environment with large pipelines or as a stand-alone virtual machine within a machine, and would again be perfect for a disaster scenario (assuming the server were still functioning).

The problem? Well, the same as MojoPac, really: students want to work on the software at home. While the virtual desktop would be possible with VPN settings, it becomes more of a support hassle for the students and the network team than making the students come to an open lab on campus. So while Xen is a wonderful product, its inability to stream the virtual machine to students in their own environment and run without network access is a missing cog. But, because it's Open Source, I'm still leaning heavily toward it.

VMWare
VMWare and its many options is perhaps the most likely competitor to Xen. Primarily, though, I looked at the Player environment, where VMWare is able to deploy local images to the machine. This is perfect for students who want to take home an environment (provided they have an external hard drive), and they can run it on Windows and Linux for free (with the free Player). Mac requires the purchase of VMWare Fusion. It's faster than other virtual environments that I've experienced on the Mac, and is pretty much universal.

The problem here is security and real deployment. Once students have the image, what is keeping them from copying it and using it after the class is over? Nothing, so we are back to licensing issues. Also, it requires a lot of hard drive space for large images, which becomes a hardware problem. It's fine for one or two images, but what if you want a specific image for each piece of software, so as to remove any potential compatibility issues? Do you have the hard drive space? So, I kept searching.

MokaFive LivePC
Then I found MokaFive's LivePC (before their change this week). Here was an environment that would stream a VMWare image, either from their servers or from your own, letting a user access their machine while online; once the image was cached, they could even run it offline. So, we have a virtual environment that is streamed (not quite a Streaming Virtual Machine, but getting close) from any website running Apache with continual HTTP access, and it can be deployed from anywhere to anyone. Maintenance also becomes almost trivial, because instead of replacing the image, you just update it and it dynamically writes only the changes.
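I don't know how MokaFive implements those differential updates, but the general idea (hash the image in fixed-size blocks and re-fetch only the blocks whose hashes changed) can be sketched like this. It's purely illustrative, with tiny 4-byte blocks for the demo; real systems use much larger blocks and their own protocols.

```python
# Hypothetical sketch of differential image updates: instead of
# re-downloading a whole VM image, compare per-block hashes and fetch
# only the blocks that changed. (Illustrative only, not MokaFive's
# actual mechanism.)
import hashlib

BLOCK = 4  # tiny block size for the demo; real systems use KB/MB blocks

def block_hashes(image):
    """Hash an image in fixed-size blocks."""
    blocks = [image[i:i + BLOCK] for i in range(0, len(image), BLOCK)]
    return [hashlib.sha256(b).hexdigest() for b in blocks]

def changed_blocks(old, new):
    """Indices of blocks the client must re-fetch to go from old to new."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    return [i for i, h in enumerate(new_h)
            if i >= len(old_h) or h != old_h[i]]

old = b"AAAABBBBCCCCDDDD"
new = b"AAAAXXXXCCCCDDDDEEEE"   # one block changed, one appended
print(changed_blocks(old, new))  # prints [1, 4]
```

With something like this, pushing a patched lab image to students means transferring only the handful of blocks the patch touched, which is what makes maintenance almost trivial.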

The only thing missing at that time was security. It still had the same problem as VMWare, and would have to be addressed the same way. Well, I left a post on the forum to see if anyone had any plans for this particular issue. I thought about it all weekend, came in on Monday, and checked the website again. It had changed drastically, and now offered a paid-for Professional solution as well as a free solution.

The Professional solution allows for "revocation", which, after contacting sales a couple of times, I found out does exactly what Licensing would want: it allows access to the machine to be revoked regardless of how much has been cached. The other question I had was about bandwidth usage, which is a huge concern. They replied by saying that it uses very little bandwidth. What is "very little"? I'm not sure, but they are going to set me up with a 30-day free trial so I can test it out and find out what the HTTP load is on the machine.

So here is a solution that answers all my issues, as far as I can see. Is it perfect? No, because the images you host on your own web server are still accessed through their system, which is what provides the security they promise. Perhaps in a few years they will provide a packaged solution that allows customers to install the image and engine software locally, but for now you give up some control of your access by having it funneled through their system.

The Verdict
From what I currently can find and understand from the various virtualization programs out there, MokaFive's LivePC product seems to be the solution for which I have been looking. The potential is there not only for the disaster deployment or regular lab deployment, but also as a potential distance education platform.

April 8, 2008

History of Food: A Review

This weekend I finally finished History of Food by Maguelonne Toussaint-Samat (ISBN 9780631194972), as translated by Anthea Bell. I've mentioned this book before as I began it, and covered the development of societies and the gender roles that were naturally created from the hunter-gatherer society.

But the book is more than that; it began to weave in and out of history and legend as each new food source was uncovered. It became less a history of man's development with food than a history of the French development in each area (starting on page 365, "Part V: Luxury Foods", in a 763-page book). It moves in and out of history, with little need for chronology. The central theme is, after all, the food itself.

Perhaps the most disappointing move in the book is its Franco-centric focus. Little is mentioned of the Asian development of food, other than its differing characteristics from that of the Gauls. German development is only mentioned when similar to the French, and Italians are only mentioned in conjunction with pasta. Americans, I'm afraid, are little more than ridiculed for their concepts of food, and only mentioned positively when the tomato, maize, and potato are introduced to Europe.

There are whole dimensions of food that are left out in the cold to wither and die because they don't fall within the Gallic centerpiece. Now, I know that French cooking is supposed to be the pinnacle of the culinary world. I know that all the prestige of the culinary arts is tied into the French way of doing things. But there are too many facets of food to tie to just one culture.

I have long contended that culinary excellence has everything to do with the cultural heritage it represents. Culture is alive in food; it thrives, and it spreads. And we in the United States have a unique position, if we can but grasp it: we have all these gastronomic adventures within reach, if we are but willing to try them. Greek, Italian (not canned, please!), French, British, Japanese, Chinese, Creole, etc. They are all there, all waiting for us to step out of our comfort zones and experience them. And at the same time, we can learn a little respect for the cultures that created these wonderful experiences.

If it were not for the beginning of this book, which covers the first development of food in society (which the French cannot with any hope of sincerity claim as their own), the book would have been immediately relegated to the shelf and never looked at again. The beginning is its one saving grace.

Now, I can't fault Maguelonne Toussaint-Samat for her focus on French cuisine. After all, she is French, and therefore has a predisposition to the French development of food. Her focus is also primarily on her home region, which I think actually declares her bias rather well. But it would have been more accurate to title the book "A French History of Food", or perhaps "A Gallic History of Food", rather than just "History of Food".

Anyway, it is still an excellent read, though if you are not interested in the specifics of French cuisine you may end up skipping several sections. If nothing else, read the first four sections. That is where the development of food across multiple cultures (at least in this book) is at its most pure.

April 2, 2008

Mac OS X 10.5 Support Essentials: With Regional Appeal

This week I have been teaching Mac OS X Support Essentials, and I'm really excited with this group. I have two students from New Mexico, one from Idaho, one from Wyoming, and the final student from Park City, Utah. It seems that our Apple classes are starting to draw a lot of students from across the West.

The class is moving along nicely, though the content density is still really high. The poor students hit their cognitive load rather quickly, so we can't move through more than a couple of chapters at a time. Unfortunately, that leaves us with about half the curriculum left to cover today. Luckily there isn't a lot that the students need to learn at this point, because all the heavy learning happened at the beginning of the course. Now we are just covering networking, peripherals, printing, and the startup sequence. But the students already feel overwhelmed.

Looking at the materials again, while I still contend they are better than the 10.4 materials (by a long shot), the course should have been made a 4-day course. Of course, that brings up a whole different concern about the price tag of the course, which most students and their employers already consider too high. It's an interesting balancing act, particularly when you think about what is required, or expected, for this level of expertise.

Perhaps, when I have time, I'll run through the materials with my magnifying lens, and see if I can't find a better design for the course. Perhaps there are exercises that are redundant, or perhaps there are topics that are not that important. This all comes after I have finally had the time to write the testing software that I intend to create.

Finally, something that I would love to see from Apple is a learning or testing platform that could be run within a Virtual Machine and distributed through a network, something like LivePC (more on that platform later, as it has really impressed me!). It would make testing easier and even easily distributed (though controlled through an access platform), so that more Apple Professionals can be out there. Perhaps the requirement for the software to work could be that it runs on Apple hardware...

About this Archive

This page is an archive of entries from April 2008 listed from newest to oldest.

March 2008 is the previous archive.

May 2008 is the next archive.
