Monday, December 20, 2010

Engineer's Minimum Education Requirements

While finishing up NSPE's November PE Magazine recently, I came across the article Image and Education (NSPE membership may be required to access).  In it, the author, Michael Hardy, P.E., F.NSPE, makes some very valid points on why professional engineers should be required to obtain education beyond the standard four-year bachelor of science degree.  I applaud Mr. Hardy for revisiting such a difficult and almost controversial issue.

I specifically like how he points out the fact that engineering is a "learned profession."  He recounts the history of engineering from a time when the general public did not hold advanced degrees, so a four-year education was enough to set engineers apart.  But today, when a much larger percentage of the public holds a four-year degree, are we really any more learned than the next guy?  How can we expect to protect the title of Engineer, and have the public hold engineering in high esteem, if we do not hold a quantitative measure of our "learnedness" above that of the general public?

But then, turn the page.  No really, on the next page of the PE Magazine November issue.

One Reason Why Additional Education is Not Enough
The next page shows a chart of engineering salaries broken out by identity: Engineering Technician, Engineering Technologist, Engineer, Professional Engineer, and Other.  I was not able to find the image, so I had to recreate the data.
Source: National Institute for Certification in Engineering Technologies' (NICET) Annual Salary Survey
I think this helps paint a clearer picture as to the controversy regarding additional education, and with it a whole slew of issues.

Engineers graduating with a four-year degree leave school with a sizable debt.  Yet, even after the four years of required experience (typical of most licensure requirements across the US), the typical salary is still between $40k and $59k per year.  That is the same salary range as all other engineering types and even "non-learned" professions.  Now why would any student smart enough to be an engineer look at those numbers and decide to invest in an advanced degree?  They would be graduating later in life, earning an income later in life, owing more debt in student loans, and not making a higher salary to compensate for the increased debt.  The student would do all of that just to have two initials at the end of their name in a political climate that allows for industry exemption in most fields anyway?  In other words, they don't need to be a P.E. to do the fun engineering work that drew them into the career in the first place.  Looking closer at the chart, out of the 5,000 respondents to NICET's survey, a higher percentage of the non-professional engineers actually make more than the professional engineers (for salary ranges above $59k).
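To put rough numbers to that reasoning, here is a back-of-the-envelope break-even sketch in Python.  The salary band comes from the chart above; the extra debt, the assumed salary premium, and the two extra years of school are purely illustrative assumptions of mine, not survey data.

```python
# Illustrative only: salary band from the survey above, everything else assumed.
BS_SALARY = 50_000        # midpoint of the $40k-$59k band
ADV_SALARY = 55_000       # assumed modest premium for an advanced degree
EXTRA_DEBT = 40_000       # assumed cost of the additional schooling
YEARS_IN_SCHOOL = 2       # assumed extra years before earning a salary

opportunity_cost = BS_SALARY * YEARS_IN_SCHOOL + EXTRA_DEBT
annual_premium = ADV_SALARY - BS_SALARY
print(f"Break-even after ~{opportunity_cost / annual_premium:.0f} years")  # ~28 years
```

Under those assumptions the degree takes decades to pay for itself, which is exactly the math a student smart enough to be an engineer is going to do.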

Why Pay More for a Commodity?
Maybe the solution is to convince industry to pay more for licensure, since the esteem of having a professional engineer is higher and therefore the engineer would be worth more.  1) I doubt it.  2) Even without a difficult economic climate, the engineering profession is on a long spiral of commoditization.  Raise the cost of an engineer in the US and that function will be shipped overseas to people who are just as intelligent and capable, but get paid less.  In short, the public welfare is still protected because foreign-born engineers receive education and experience equivalent to that of US-born engineers, sometimes even more.  And I don't want to sound protectionist just for the US.  There are plenty of other countries in the world having similar issues with regard to the engineering profession.

Conclusion
Additional education requirements beyond the typical four-year bachelor's degree are something that the engineering profession will be struggling with for many years to come.  As much as I wouldn't want to be shackled with increased debt upon graduation, I would still pursue the profession if that was my calling.  Look at physical therapists, for example.  They require the same education as a medical doctor, which equates to graduating with the same level of debt, but they don't earn nearly as much as an MD.  Yet, there are still plenty of physical therapists and PT students.  I believe that raising the education requirement for engineering is a necessity.  How to do it, though, is up for debate.  All I can recommend is that enough notice be given to future engineering candidates.  That means, 1) decide on the policy now, 2) make the policy active four years from now so high school freshmen have time to decide on their careers, and 3) grandfather everyone else into the current system.  Then, industry has to be ready for a two-year gap before it can hire graduate engineers.  That gap runs from when the last grandfathered graduates leave school until those who fall under the new rule finish their additional education.

I'm sure plenty of students will step up to the challenge and there will be no shortage of engineers.  But, we currently need to continue increasing the awareness of engineering through STEM programs for young students -- grade school through high school.  We also need to get involved politically and work to remove or limit industry exemptions.  Then, and only then, will engineering be a learned profession held in high esteem.

Thursday, December 9, 2010

'Smack's HP Elitebook Giveaway

image courtesy of SolidSmack.com
Am I greedy? No.  Am I a gambler? Yes.  And any time I gamble I like to improve my chances of winning.  Not only that, but by doing so I give you the opportunity to win this exciting HP giveaway.

You see, the young Jedi daringly risked his life to wrestle this HP Elitebook 8740w away from bears and small children to be able to offer it to one lucky winner of this giveaway, and he wasn't even in line to see Santa Claus.

So if winning yourself a Christmas present tickles your fancy, head on over to SolidSmack.com and read the rules on how to enter this drawing.  You may have to up the ante though, by wrestling dragons!

Monday, November 29, 2010

Final Days of Movember

It is the final days of Movember.  Thanks to everyone who has donated to this worthy cause and even more thanks to those who brought awareness to men's health.




http://us.movember.com/mospace/887612/

If you are feeling generous you can still donate.  Or, if you happen to be curious how bad I look with a mo, check out my MoSpace page from the link above.

Monday, November 1, 2010

Your Movember Public Service Announcement

It's that time of year again: November is Movember.  That's right, there are no pink yogurt tops or pretty ribbons.  Movember is the month that men grow mustaches for men's health awareness, specifically prostate cancer.

Although it started in Australia, this awareness campaign has spread worldwide.




The rules are simple:
  1. Start clean shaven on November 1st.
  2. Take a picture of your face, whisker free.
  3. Grow a mustache during the month of November.
  4. Tell everyone what the soup strainer on your face is for.
  5. At the end of November, go back to your regularly scheduled facial hair.

Some important points to add about the mustache itself.
  • You are not allowed to connect the mustache to any other facial hair:
  • You are not allowed a goatee.
  • You are not allowed a beard.
  • You are not allowed to connect to your sideburns for some epic chops.
  • You are allowed a "tickler."

In case you are wondering, I scoured the Googleplex for mustache examples.


I'm not a fan of facial hair, and it takes me a week to show a 5-o'clock shadow, but I'm going to do my best to make this happen.  If I can do it, so can you.

For More Information
Official Movember Site
Movember on Wikipedia
Follow Movember on Twitter
The Art of Manliness

There are plenty of Movember teams to join.  Head over to the official site, register, and start raising awareness of men's health.  (By the way, it is also about raising donations for research.)
My Team

Friday, October 29, 2010

Cleaning the Waste - Nuclear Waste

I'm a proponent of nuclear energy.  I believe that it has the potential to be one source of the solution to replace fossil fuels.  But, before that can happen, a few things need to occur:









  •  The general public needs to be educated on benefits and shortcomings of nuclear energy.
  • Politicians and policy makers also need to be educated.
  • Policy needs to be adjusted to current levels of understanding of nuclear processing, not the over constraining regulations made decades ago to safeguard the public from unknown technologies.
  • Finally, develop a way to better handle the waste.
The July issue of ASME's Mechanical Engineering Magazine has a great article about some new technologies being developed to deal with the waste.  Beyond that, reading deeper into the article, you can see how much has changed in the processing of nuclear energy.

The Hanford Site
The Hanford Site in Washington State, just ten (10) miles from the Columbia River, has 149 aging radioactive waste tanks that need to be emptied and cleaned.  The U.S. Department of Energy wants these tanks cleared in 30 years.  It has been a decade, and only six (6) of the old tanks have been cleared of the required amount of radioactive waste (99%).  Quick math:
10 years for 6 tanks = 1.667 years per tank * 149 tanks ≈ 248 years >> 30-year requirement
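Spelled out as a quick Python sketch using the same numbers, the current pace misses the deadline by roughly a factor of eight:

```python
tanks_total = 149
tanks_cleared = 6
years_elapsed = 10
deadline_years = 30

years_per_tank = years_elapsed / tanks_cleared       # ~1.67 years per tank
projected_years = years_per_tank * tanks_total       # ~248 years at the current pace
print(f"Projected: {projected_years:.0f} years; "
      f"need ~{projected_years / deadline_years:.0f}x speedup to hit the deadline")
```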
Enter New Technology
Obviously, they need to speed up the job.  But, the problem isn't necessarily with cleaning up the tanks, but rather with getting access to the sludge (the hard stuff known as heel).  The tanks are buried underground.  They are made of single-shell construction with steel sides and floors and reinforced concrete roofs.  They leak, resulting in groundwater with excessive radioactivity (thus the reason for the cleanup).  Access to the contents of the tanks, built from the 1940s - at the start of the Manhattan Project - through the 1980s, is via risers.  The risers vary in diameter, down to as little as 12 inches wide.  Not much equipment capable of cleaning out hard heel can fit through a 12-inch pipe.

But before anyone latches on to the fact that groundwater is contaminated, remember that these tanks and the storage process were developed at the same time we were just learning about nuclear energy.  Of course, with any technology in its infancy, there is bound to be some learning.  The fact that these things lasted 70 years is a testament to the engineers who designed them with the working knowledge they had at the time.  What is quite intriguing is the fact that the 149 tanks are being replaced by 28 new, double-shell tanks.  Our understanding of nuclear waste, as well as how we handle it, has vastly improved since 1940.  There is no reason to maintain policies developed with such limited knowledge.  It is time to update them!

Going to MARS
Although some people would like to just toss our trash into someone else's backyard (not considering the opposition to nuclear-powered spacecraft), I'm not referring to the planet.  MARS is the Mobile Arm Retrieval System, designed specifically to enter the old tanks, break up the heel, and vacuum out the waste material.  But MARS will not fit into a 12-inch pipe.  Therefore, engineers have devised a way to reinforce the ceiling of the old tanks and then cut a 54-inch diameter hole that MARS can fit into, while maintaining negative pressure in the tank so no contaminants can escape.  Using MARS will do in one (1) month what took the old method several.

The Final Steps
Much work has to be done to prepare for MARS use, but the results may mean that the entire site will be cleaned by the deadline.  Of course, that means all 53,000,000 gallons of nuclear waste have to be processed and treated.  Permanent storage will be in a stable glass form - something that is safe and didn't exist in 1940.

For More Information
Washington River Protection Solutions
MARS [pdf]
The Hanford Site

Tuesday, October 19, 2010

National Building Museum

The National Building Museum is just one more place I need to visit whenever I find myself in DC, especially when they have cool exhibits, like the one on parking garages (which ended this past summer).

Parking garages: those often forgotten or overlooked buildings that require just as much engineering and design time as the actual occupied structure.  It's a good thing we engineers and architects can see the joy and beauty in the things often overlooked by the general public.  If not, how else could our urban development continue to grow out of our automobile-obsessed culture if there were no place to park once we got there?


ASU's Latest Parking Garage
Parking garages have evolved since the invention of the Ford Model T.  From simple barns to ramped concrete structures - thanks to advancements in column design and concrete - to automated parking.  And the advances in parking structures have been beneficial to other areas of the construction world, including city planning and eco-friendliness.
Chicago Corn Cob Building - the lower floors are for parking


For More Information


Thursday, October 14, 2010

Esta noche todos somos Chilenos

"Tonight, we are all Chileans!"

With the rescue complete and the live updates on CNN finally over, the entire world can breathe a sigh of relief that all of the trapped miners have been successfully freed.  So why is this disaster different than the Gulf oil spill?  The following is meant to be a thought-provoking harangue based on the limited or biased information received from news sources.  It is not meant to be religious or political, although the points of the discussion may be.  It may not be entirely accurate, and some aspects are filled in with best guesses, as news sources tend to leave out important details.  The point is to determine, from an engineer's perspective, why things are classified or qualified in the terms they are.

Why is the mine rescue considered a success while the oil spill a disaster?
Weren't both events initiated by a tragic design or process flaw?  Yet two months to rescue the miners is considered successful, while months spent stopping the oil leak is a failure?

I think the answer comes down to social and political aspects.  When the Deepwater Horizon exploded, the US government immediately got involved and BP management immediately got involved.  The goal of each of these organizations was not to stop the oil spill, but rather to save face.  These two groups got involved and started professing solutions before they even analyzed the problem.  And then, when real solutions presented themselves, these groups closed the doors on those opportunities because it would make their initial guesses look that much more ridiculous.

On the other hand, the mine rescue did have management and government involved, but the goal from the onset was to rescue the miners.  Government and management got engineers involved from the start and came up with three unique plans.  Then, they initiated all three plans simultaneously.  Yes, initiating three plans at the same time meant three times the cost, and the cost of two of those plans would be completely wasted.  Compare that to the BP oil spill, where only one plan was worked on at a time.  When it failed, another was started from scratch in an effort to save money.  It wasn't until late in the process, much too late, that the engineers really got involved in the oil spill capping process.  Shortly after that, the problem was solved.

Why is God the reason for the successful rescue?
Why isn't he also blamed for the original cave-in?  Why didn't God help solve the oil spill problem, or prevent it from happening in the first place?

I am a religious person and am always conflicted between my science and reasoning and my faith.  That's why this question is so interesting to me.  Weren't both of these disasters based on a similar root cause - a corporate culture or product flaw?  Then why is God involved in the solution of one, but not the other?  Why is God involved in the solution, but not the problem?  It seems to me that an earth-moving event would be in the realm of God.  Weren't engineers involved in creating the solution to both problems?  Didn't they come up with the ideas to drill the rescue shafts or cap and seal the well?

I don't have any answers to these questions.  I don't even have any guesses or an ability to apply logical reasoning to these.  My internal conflict between science and faith remains.

Where are the scientists?
I'll tell you where the scientists are.  Scientists discovered the oil under the Gulf.   Scientists discovered the ore under the ground.  From that point on, engineers figured out how to get at those resources.  And, engineers figured out how to resolve the crises.

Dear media, please learn the difference between a scientist and an engineer and use the proper title when referring to the two disciplines.

Is industrial exemption to blame?
Both of these incidents have sparked a resurgence of debate within the engineering profession regarding industrial exemption to licensing.  For years, governments have been changing policies to remove Professional Engineers (PEs) from positions that are designed to protect the public safety.  Why? To save money, usually.  And industry is happy to oblige because then they don't have to hire licensed engineers either.  They get away with industry exemption.

Since an oil rig is only used by the company who designed and built it, a licensed engineer is not required to be in responsible charge.  No (outside) public will ever be exposed to the oil rig and therefore it is industry exempt.
Since a mine is only used by the company digging it, a licensed engineer is not required to be in responsible charge.  No (outside) public will ever be exposed to the mine and therefore it is industry exempt.

Well, obviously, the public is exposed to an oil rig or mine.

Thousands of gallons of hydrocarbons flowing into public waters deserve to have the operation overseen by someone of suitable qualifications and ethics to maintain a safe environment.  Focus has to be not only on getting to the oil as quickly and cheaply as possible, but also on maintaining a safe work environment and public well-being.

A mine collapse affects not just the miners inside the mine or the ore output for the company.  A mine collapse affects the friends and families of the miners; it affects the rescuers and their families.  What if a mine collapse drastically changed the surface?  Wouldn't that affect water flow (rain or runoff)?  How about nearby houses or buildings?  (Not necessarily the Chile mine, but for others this would be a concern.)  There are many aspects outside of the end product - oil or ore - in which the public needs a voice.  The best way is to have a licensed professional engineer in responsible charge.  Granted, a single PE is not going to take on the liability of an entire airplane, but there should be more involvement in the process of building the airplane by someone who is not industry exempt.

Wednesday, October 13, 2010

Fighting Fires - Several Stories Up

  • We're happy that Mrs. O'Leary's cow didn't start the Great Chicago Fire.
  • We're happy that Chicago was able to rebuild using modern technologies.
  • We're even happier to know that fire fighting methods continue to improve, especially when it comes to fighting blazes in multi-story apartment and office buildings.

The U.S. National Institute of Standards and Technology (NIST) has a Building and Fire Research Laboratory.  As you can imagine, this laboratory researches fires: how they start, how they propagate, how to prevent fires, how to fight fires, and most importantly fire safety.  (They have some really cool CFD tools to simulate fire.)  But, much like the Great Chicago Fire, "greatness" usually comes after a tragedy.

In December 1998, three New York City firefighters lost their lives to thermal shock created when the windows of the apartment building they were securing failed and, coincidentally, an occupant on the opposite side of the building opened a door.  The wind, strong at that height several stories up, rushed through the open windows, carried the heat of the fire through the hallway the firefighters were traversing, and out the open door.  The heat was so intense that the firefighters' bodies shut down; they had no time to escape.

This tragedy, like many others like it, has opened the eyes of researchers, deepening our understanding of the world around us and inspiring new ways to keep people safe.  This is especially important in apartment fires.  The National Fire Protection Association stated that 7,300 high-rise fires occurred in 2002, mostly in residential buildings.  Of these, 92% of the fatalities occurred because the fire spread beyond the room of origin.  Something as simple as closing a door may have prevented those fatalities.

Developing New Methods
If a slight breeze can cause such devastation in a high-rise fire, how can it be limited?  Better yet, how can it be used to help stop the fire?

Simple: when a window is open, close it.  With a big curtain.












One of the main problems fighting a high-rise fire is getting water to the upper levels of the buildings.  Ladder trucks have nozzles, but they only reach a certain height.  Sending in firefighters is the usual option, but that exposes them to dangers that may result in the same tragedy that occurred in New York.  Therefore, a concept developed by New York and Chicago firefighters is the high-rise nozzle.
Firefighters can enter the building from a floor not exposed to fire or excessive heat and deploy a water nozzle to the floor above.

Smoke inhalation is the largest known contributor to fire deaths.  Smoke also happens to be the largest inhibitor to quick fire response due to the lack of visibility.  Chicago firefighters have begun testing large portable fans to create a positive pressure in stairwells and rooms.  This controls the smoke and heat of the fire, sometimes containing it to the room of origin.














(above images courtesy of Fire.gov of Governors Island Experiments)

Proving the Point
Ideas are great, but how do you prove they work?  You do so by teaming up with NIST and the Polytechnic Institute and finding a little funding through the Department of Homeland Security, FEMA, the US Fire Administration, and the Assistance to Firefighters Research and Development Grant Program.

Using NIST's fire laboratory, quantitative measurements could be taken that showed improvements in temperature, heat flux, gas velocity, pressure, oxygen, carbon dioxide, and unburned hydrocarbons.  Seeing positive results in all of the experiments using the ideas devised by New York and Chicago firefighters, the testing moved out of the lab and into controlled burns.

Abandoned Military Bases
Where better to light a building on fire than an old building owned by the government that is no longer in use?  How about Governors Island, south of Manhattan, and home to an abandoned military base containing several multi-story buildings?

These experiments verified what the lab testing had shown.  Controlling wind - either by removing it in some areas or adding it in others (positive pressure) - significantly improved fire fighting conditions and also made it easier for occupants to egress from the building.  Several fire departments from North America were present to watch the experiments and are now looking at implementing them as standard practice.

And let's not forget that great software.  With empirical data under its belt, NIST can continue to develop a theoretical model to simulate fires using a combination of Fluent and NIST's Fire Dynamics Simulator.  As more simulations are run and the effectiveness of the new tactics is proven, you can bet that new construction will see these techniques built right in.  The simulations also provide a great method to train firefighters on these new tactics and show them the positive and negative effects of using them.

Of course, the part that interests me the most about WHY these tactics need to be developed goes all the way back to the Great Chicago Fire.  When Chicago rebuilt with high-rises, the technology that we have available today, like deluge sprinklers, didn't exist.  So now, all of those older buildings need new, portable, ways to control fires.  Hopefully, with these new techniques, we don't have another Great Fire in any city.

For More Information
NIST Building and Fire Research Lab
Polytech Institute
NFPA
Fire.gov

Monday, October 11, 2010

The Great Chicago Fire

The Randolph Street Bridge
I have always found the description of tragedies as "great" somewhat perplexing.  How can it be great that 300 people died, or 100,000 lost their homes?  How can a war be great?  How can something so horrible be terrific (horrible + terrific = horrific)?

While many of us this past weekend were celebrating the fact that Europe and the United States shared the same date (10/10/10), others were celebrating the anniversary of the Great Chicago Fire.

History
On Sunday, October 8, 1871, behind the home of Patrick and Catherine O'Leary, a small fire started in a barn or shed.  Unlike in today's litigious climate, Mrs. O'Leary and her poor cow were exonerated from starting the blaze because of evidence that clearly pointed to other problems.
  • The firemen were exhausted from fighting a planing mill fire the night before.
  • The equipment was in poor shape as well.
  • The fire department watchman called out the wrong location (firebox), sending fire crews the wrong way.
  • Upon noticing his error, the watchman called for another box (still not entirely correct), but the telegraph operator (dispatch) didn't want to sound another alarm and confuse firefighters.
  • A vigilant neighbor noticed the fire and ran to the closest fire alarm, but the owner of the alarm refused to sound it and prevented the neighbor from sounding it.
All of these systematic errors were more of a cause of the Great Chicago Fire than the original flames in the barn.

Codes and Standards
The main reason the fire spread so rapidly was because, at the time, Chicago was a city built out of wood.  Normally that would not be so bad, but due to the drought conditions at the time, the entire city was exceptionally combustible.  If the city had better building codes and standards in place at the time, there would most likely have been fire-stops in place.  We can compare that to recent tragedies: a 7.0 earthquake in Haiti and an 8.8 earthquake in Chile.

In Haiti: 230,000 died; 300,000 injured; and 1,000,000 made homeless (source).
In Chile: 521 died (source).
So how come so many more people died in the Haiti earthquake even though it was less severe than the Chilean earthquake?  Answer: Chilean buildings were designed and constructed using better codes and standards.
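For a sense of how lopsided that comparison really is, the moment magnitude scale is logarithmic: radiated energy grows by roughly a factor of 10^1.5 per unit of magnitude.  A quick sketch of the arithmetic:

```python
haiti_magnitude = 7.0
chile_magnitude = 8.8

# Radiated seismic energy scales roughly as 10^(1.5 * magnitude).
energy_ratio = 10 ** (1.5 * (chile_magnitude - haiti_magnitude))
print(f"Chile's quake released roughly {energy_ratio:.0f}x the energy of Haiti's")  # ~500x
```

So the Chilean quake released on the order of 500 times more energy yet killed a small fraction of the people.  That is codes and standards at work.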

And Chile's toll could have been even lower, but not all codes and standards were followed; the country had recently opted to shortcut some standards in order to boost its economy by allowing less-than-ideal materials and methods.  All the more reason codes and standards exist, and why licensed professionals need to be in charge of protecting the public safety.  Taking shortcuts or revising standards to make short-term economic gains only results in long-term costs.

Beauty in the Decay
Amazing things happen in nature.  Lightning strikes sand and creates unique glass sculptures.  Fire softens glass and molds marbles together.
Glass Marbles
Or welds metals.
Metal Files

Rebuilding
What happens after a disaster is often the determination for it becoming great.  There is nothing great about war, but what happened after the Great War is nothing short of extraordinary.  The same can also be said about the rebuilding of Chicago from its ashes.

Businesses started cropping up within a couple of days after the fire was out.
Rubble was swept away and piled up.  Much of it pushed into the lake, creating new real estate.
"World's Busiest Intersection" Corner of State and Madison
And it figures, the first store to reopen in the Burnt District was a smoke shop - Schock, Bigford & Company, selling cigars, tobacco, grapes, apples, and cider out of a wood box with a broad sign in front.

Had the fire not destroyed four square miles in the heart of Chicago, the city would have been long delayed in modernizing.  Upon rebuilding, Chicago started using better materials and new technologies, which allowed it to grow up, not just out.  New safety brakes on elevators meant the first skyscrapers entered the Chicago skyline.  Numerous other advancements occurred during the rebuilding phase that could not have happened if the old buildings had to first be torn down.  Which politician is going to force people from their homes?  I know that when I play SimCity and need to revitalize some neighborhoods, it is much easier for me to decrease funding to fire departments and call forth an alien invasion than it is to bulldoze the area.  Look at the aging infrastructure of the United States and the cost to dismantle it before we can build new.  Look at the updated infrastructures of war-torn areas that would otherwise not be modernized.  The disaster itself is not great, but what comes after often is.
View from Water Tower looking North

Thursday, October 7, 2010

Organic Operating System

Life imitates art - Skynet is getting so close to reality that even Arnold Schwarzenegger is getting nervous.  When it comes to developing really cool technologies the world has never seen, the Air Force Research Laboratory (AFRL) is the place making that happen.

The AFRL in Rome, New York has been developing a self-aware computer system that automatically adapts and optimizes its behavior.  Cut off its arm, and it uses another system to hold the gun to your head.  It does this through a new "organic" operating system (OOS).

The user sets inputs by setting a goal and allotting a budget.  The self-aware OOS has five key properties to accomplish the task:
  1. Ability to observe itself and optimize its behavior.
  2. Observe application behavior and optimize the application.
  3. Self-healing.
  4. Goal-oriented while optimizing constraints.
  5. Is lazy - using just enough resources to accomplish the task.
To put these into terms we can all understand:
GOAL: The Terminator has to kill Connor.
CONSTRAINTS: The Terminator can only kill Connor using the death tools of the timeline he is transported to.
OBSERVANT: The Terminator recognizes Connor as a threat and takes a defensive posture.
SELF-AWARE: Connor removes the Terminator's gun-shooting arm.  The Terminator realizes he can no longer shoot Connor with that arm, so he picks up the gun with his other arm.
GOAL-ORIENTED: The Terminator continues toward his goal under new constraints by shooting at Connor with his other arm.
LAZINESS: The Terminator doesn't waste effort running after Connor or fixing his arm; he just picks up the gun and keeps shooting.
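For the programmers out there, here is a toy sketch of what those five properties might look like as a control loop.  To be clear, this is not the AFRL's system or anything close to it; every name below is made up, purely to illustrate the observe/self-heal/goal-oriented/lazy behavior.

```python
# Toy illustration only -- not the AFRL OOS. All names are invented.
def pursue_goal(goal_met, resources, budget):
    """Work toward a goal with the cheapest usable resource, switching when one fails."""
    spent = 0
    for cost, use_resource in sorted(resources):          # laziness: cheapest option first
        while not goal_met() and spent + cost <= budget:  # goal-oriented, within constraints
            succeeded = use_resource()                    # observe the result of each attempt
            spent += cost
            if not succeeded:                             # self-healing: abandon the broken
                break                                     # resource, fall through to the next
    return goal_met(), spent
```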

Compare that to today's SCARAs or industrial robots.  If a welding robot loses its welding head, it doesn't stop welding.  NOOOO!  It keeps following its program, completely unaware that it's not actually accomplishing its goal or that it has an appendage dangling by its hydraulic hose.  Industrial robots are not self-aware and are not hell-bent on accomplishing their goals at all costs.

But that's not all.  It gets scarier.  Imagine a Terminator made of Liquid Metal combined with the ability to self heal.  That's enough for even Robert Patrick to be concerned.

I'd pay close attention to the AFRL.  While the commercial world is ogling CPUs and GPUs and APUs as great technical advances, the military is developing an OOS capable of taking over the world via its cloud computing capabilities.

For More Information
Defense Tech Briefs: Self-Aware Computing article
Liquid Metal Technologies
The Apple iTerminator
Skynet (just kidding, here's the real link)

(Solidsmack, aka Baltar, has been warning us! Or just desensitizing us for our new masters.)
Robots Will Digest and Poo You
Design the Robot Face of Death
Robots too Small to Carry You Off, Not Your Kids
Robots Won't Eat You, Just Take Your Job

Wednesday, October 6, 2010

A Better Grocery Getter

I've said it before, I love collegiate design challenges, and ASME's Human Powered Vehicle Challenge is no different.  Imagine pedal-power streaking you down the roads at 45 mph (72 km/hr).  That's enough for a speeding ticket in most neighborhoods -- while riding a bike?

This year's challenge was a little different, and I would say much better, thanks to probing questions from the judges and a new contest category.

Is It Practical?
The question posed by judges in competitions from years past was "Is your vehicle practical?"  The students from the team at Rose-Hulman could not answer "yes" to that question, so this year they made the bike practical.

Examples of the impractical nature of the prior years' vehicles include: difficult to ride, uncomfortable, required a team of people to launch, and needed constant maintenance.  Those aren't things grandma is going to be able to handle on her own.

The New Category
The unrestricted class not only had teams competing in speed, but also in utility endurance events: climb a ramp, go over a speed bump, pass through a simulated rain shower, stop and pick up a parcel.  That's right, they want to make sure granny can carry her groceries home on her bike.  There is still nothing to prevent her from stopping right in front of the door, parking in the handicap spot, or going the wrong way down the parking aisles, though.

Congratulations
Congratulations to the Rose-Hulman team for taking 1st place for the third year running.
  

More Information

Wednesday, August 11, 2010

My New Desktop PC

A while ago I posted information on the trending topic of AMD and ATI.  The combination of CPU and GPU makers has been a match made in heaven for future tech, and I'm still very impressed with the direction the company is going and where the technology is moving.

As much as I wanted to hold out for an APU based system, I'm afraid that my old home computer had something else in mind.  The good news is, my wife gave me permission to buy a whole new system instead of just trying to find and replace the one bad component (which I later narrowed down to the motherboard - that's a story in and of itself).  Below are the components of my new system.  Although my old mouse, keyboard, and monitor worked fine, I just couldn't pass up getting spare parts and a nice 24" 1080p widescreen monitor.

The final cost was US$1334.89, not including the OS or Space Navigator.  I try to price entire systems below $1500, and I could have saved a lot of money by not getting the monitor.  I did save room for upgrades by buying components below the major price breaks.  For example, the AMD Phenom II X6 has a nice feature not included in the X4s - the Turbo CORE technology.  But, the price jump was too much to bear.  The motherboard will accommodate an X6 when the prices come down.  I also picked up the Radeon HD 5770 because the 5800- & 5900-series were up to twice the cost and more.  I have no benchmarks, but the Win7 Windows Experience Index gives good grades, 7.0 to 7.4 on all sections except for HDD speed, which is a 5.9.  I want an SSD, but they are still too expensive to justify for home use.

One thing I did notice right away is that my internet connection speed is now the weak link.  I knew this already, but it didn't matter because my old computer really couldn't process data much faster anyway.  Now, I am often waiting for data to come through my DSL pipeline before my computer has something to do.  Funny, QWest just came out with a new HDInternet.com website promoting its 40 Mbps speeds, yet I can still only get 1.5 Mbps in my area.  Come on QWest!  What will it take to get decent internet speeds in my neighborhood?

The only downside to this setup is the video card.  After getting into regular use of the computer, my screen would show vertical stripes or just go blank.
This happened after coming out of the S3 sleep state.  After searching the internet for days, I found out that the 5700- and 5800-series graphics cards have a known problem that users have been complaining about since Jan/Feb 2010.  ATI supposedly fixed this problem with a hotfix to Catalyst Driver 10.1 around March.  I installed my card with the 10.6 drivers and updated to the latest (and current) 10.7 drivers with no success.  Rumor has it the fix only works on 5800-series cards.  Thankfully, one website shows that raising the voltage or underclocking the frequency of the video card through the Catalyst Overdrive tool resolves this problem.
I have been stable for the past 3 days, but the solution also prevents my computer from going into the complete sleep state it once did.  I wonder if altering the Windows Power Management features would also resolve the problem.  If keeping it out of sleep state is all that is required, I may just reinstall BOINC and put my computer to use since I can't save energy.  The good news, even with the reduced frequency, this setup still outperforms my old computer based on my experience and feel.

Monday, July 19, 2010

Cold, Colder, Really Cold, and Construction

There are four seasons in the year.  When you are from the northern latitudes, they are cold, colder, really cold, and construction.  Construction season is in full swing, and many of you have probably noticed the extra chips in your windshield from following too closely behind that asphalt-laden dump truck.  You may also have noticed the diesel-belching, earth-moving equipment slowing your commute in an effort to shave 30 seconds from your daily trek when construction completes... sometime next year.  But have no fear, my tree-hugging aficionados: not only is that off-road equipment much more fuel efficient than in years past, but the methods used to repair existing roads are also eco-friendly.  (Note: the Caterpillar Rotary Mixer RM-500 is shown above because I have classmates who work for Cat and huge equipment is, well, it's just cool.)  I am referring to an old technique that is gaining newfound attention: ROAD RECYCLING, sometimes referred to as road reclaiming.

Road recycling utilizes the existing road material to build the new road (over the same road bed and rights-of-way).  When done properly, the technique of Full Depth Reclamation (FDR) uses 100% of the old road.  That means no waste to be hauled away, fewer diesel-belching off-highway construction vehicles, and fewer chips on your windshield.  Check out the following graph for potential savings.


The process is pretty simple and straightforward.  Heavy equipment, like the RM-500 shown above, pulverizes and mixes the old pavement and road bed.  During the pulverizing, water or stabilizing agents such as cement are added to the old material.  The result of the pulverizing and mixing is a material suitable for a stable foundation that can be shaped, graded, and compacted into a new road bed.  The final step is a new pavement layer made of chip-seal, asphalt, or cement.  Although that means energy and new raw materials are still needed for the pavement layer, it also means that all of those used tires can still be chopped up and used for a quiet driving surface instead of tossed into landfills.

I, for one, am pleased to see this old idea breeding new life into our roads.  We can carpool, take public transportation, or buy hybrid vehicles all we want to help save the environment, but the truth of the matter is we still need roads to ride on.  I like my roads pot-hole free and that means road construction.  Finding GREEN methods to repair or replace existing roads deserves just as much attention as the vehicles that ride them.  I think Captain Planet would be proud.

Information for this post provided by RoadRecycling.org and American Road Reclaimers.

Wednesday, July 7, 2010

Because No One Told Them They Couldn't

Congratulations to Bishop Kelly High School students in Boise, Idaho, for winning the "Best Overall" title for inventing the P.A.W.D., or Personal Assistive Writing Device during the National Engineering Design Challenge (NEDC) sponsored by the Junior Engineering Technical Society (JETS). (image courtesy of NEDC)
I love design competitions.  Where else does uninhibited free thought reign supreme?  No politics; no business plan; no ROI calculations; no worries about intellectual property or protecting patent rights -- it is the ultimate location for the entrepreneurial spirit.  Imagine the troubles associated with designing an adaptive device in the business world: E&O insurance, liability insurance, time to market schedules, IP safeguarding, just to name a few.  There is little possibility that something this simple, and this elegant, would have been designed.

What is even cooler about this design competition?  It is done by high schoolers.  I don't know about you, but I was pretty smart in high school and I still wouldn't have thought about designing an adaptive device, much less actually making it happen.  Students these days are incredibly gifted, if we would just properly guide them and keep all the crap out of their way.  And to see a high school that actually has an engineering department!  I may just move to Idaho.

So how do these students design such a device and win a national competition?  Are they that much smarter than we were in high school?  Do they just have a better administration and opportunities provided to them?  Maybe, but I think it comes down to the simple fact that no one told them they couldn't do it.  No one told them that you better not develop an adaptive device because someone will inevitably come at you with a lawsuit.  No one told them that insurance companies won't pay for an assistive device, so there is no monetary value in helping the disabled.  No one told them that great ideas shouldn't be followed just because someone else will be there to stand in your way.

I love design competitions.  They bring out the true entrepreneurial spirit.  They bring out the altruistic nature in those that WANT to do good, but are hindered by FEAR.  They bring out the creative minds unfettered by society's constraints.

Congratulations Coach Guy Hudson and the Bishop Kelly team.  Try spending some of that prize money on hair cuts.  {Teens these days.}

Thursday, July 1, 2010

Turn Your Design Process Upside Down

Reading through some industry literature today, I'm finding a trend among best-in-class companies in the techniques they use to remain best in class.  Specifically, I'm looking at the way they run their engineering design process.

I like this description of the design cycle (image courtesy of NASA).  But, I'm more interested in the detailed steps between 4. Build the Item and 5. Evaluate.

For example, I envision a team sitting around a conference room table 1. Stating the Problem that needs to be resolved and putting together a mind map during the successive brainstorming sessions to 2. Generate Ideas.  Out of those ideas, a 3. Solution is Selected and then the engineers go to work 4. Building the Item and sending it to the analysis experts to 5. Evaluate.  Of course, the results are presented and if the item does not resolve the problem, the cycle repeats.

What is the Most Efficient Way to Build and Evaluate the Item?

Another way to phrase that question is how to Design and Analyze the item?

The typical way is to take the idea and throw it onto the designers' (design engineers') desks and let them build up a concept until they are ready to send it over to an analyst.  After the analysts run their simulations, they provide feedback to the designers to make the needed changes.  That back-and-forth cycle repeats until the resultant solution is acceptable, typically with the analysts' time consuming most of the design cycle.

CAx software providers are helping to sway that process by bringing analysis tools to the designer.  By using easy-to-use, dumbed-down, integrated CAD/FEA, the designer can perform a first-pass analysis and decrease the development cycle time by not involving the analyst until later in the process while still running multiple "what-if" scenarios.  WHAT IF THAT PROCESS IS WRONG?  We've all heard "blue is good, red is bad" from people who don't understand what accurate simulation requires.  Putting these tools into the hands of the inexperienced is dangerous and does not protect the safety of the public, nor does it create higher quality or less expensive parts.  Having designers run the first pass - which eventually leads to skipping the detailed analysis altogether - may look good on the books, but that practice will come back to bite you.

The New Design Process

Rather than put analysis tools into the hands of the under-educated or less experienced designer, get the models to the analyst sooner - not later.  Let the analyst create a detailed and accurate simulation of the design and run the first simulation to prove the simulation's accuracy.  Then, turn the entire simulation over to the designer in the form of a template.  The designer is allowed to modify the design within the boundaries of the template.  With each modification, the designer can re-run the accurate simulation - not just a simplified first pass.  If the modifications overstep the boundaries of the template, the analyst will have to review the model and update the parameters of the simulation; run it to verify accuracy; and then hand it back to the designer to continue using the updated template for additional "what if" scenarios.
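A minimal sketch of that template idea, assuming a generic scripting layer; the class and parameter names here are mine, not any CAx vendor's API:

```python
# Hypothetical sketch: an analyst-defined template that a designer can re-run
# safely, but only within the bounds the analyst set.
class SimulationTemplate:
    def __init__(self, bounds, run_full_simulation):
        self.bounds = bounds                        # {param: (low, high)} set by the analyst
        self.run_full_simulation = run_full_simulation

    def run_what_if(self, design_params):
        """Re-run the analyst's full, accurate simulation for a designer's modification."""
        for name, value in design_params.items():
            low, high = self.bounds[name]
            if not low <= value <= high:
                raise ValueError(f"{name}={value} oversteps the template; "
                                 "send the model back to the analyst for review")
        return self.run_full_simulation(design_params)
```

The designer can call run_what_if() as often as they like; the raised exception is the hand-off point back to the analyst.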

All Is Not Lost

The entire framework to integrate CAD and FEA is not wasted.  In order to get the new design process to work, a smooth transition between design and analysis still needs to happen.  Data needs to be interchangeable.  But focus needs to be on software features that allow development of a simulation template unique to each design.  The features need to include user-level control so designers can't just override an analyst's template.  The UI needs to allow simplification for the designer yet full control for the analyst.  Processing power needs to continue to increase to allow for complete simulations to run quickly.  Software algorithms need to be adjusted so only modifications (within template limitations) need to be reanalyzed, not the entire model.

The current trend of integrating CAE with CAD is on the right track, but simplification of analysis for the design engineer may be the wrong focus.  Instead, focus on allowing an experienced analyst to set up simulation templates for the designer to use.  Keep the simulations complete and accurate while still reducing the development cycle time.  So turn the design process upside down.  Stop moving more FEA features toward the designer while delaying analyst input.  Instead, move the analyst forward in the process and make them more integral to the design cycle.

Tuesday, April 20, 2010

How Benchmarks Don't Stack Up

An interesting coincidence occurred today. AMD posted an article about how synthetic benchmarks don't represent the real world. This is yet another example, one a little more scientific than my previous posts on rethinking your choice for computer hardware (Intro, CPU, Motherboard).

In my prior posts I never came out and stated it so clearly, but the message was implied: simple metrics like CPU speed, FSB speed, or in this case benchmark results, don't necessarily paint an accurate picture of the overall system-level performance of your new computer rig. What is most important, when selecting a new computer or components as a do-it-yourselfer, is that you research each component and how well they communicate with each other.

Just like there are bottlenecks within networks, there are bottlenecks within computers. Losing 2% performance on a single component may make zero difference to the system if a bottleneck exists elsewhere. Save the cost of the component and spend that money on improving the performance of the bottleneck. How do you find your bottleneck? That's a catch-22, because you need to run benchmarks in order to find it.
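If you would rather not trust a synthetic score at all, a few lines of code timing your own workload will point at the bottleneck just as well. A rough sketch; the large file below is hypothetical, so substitute whatever your real workflow actually touches:

```python
import time

def timed(label, fn):
    """Run fn once and report how long it took."""
    start = time.perf_counter()
    result = fn()
    print(f"{label}: {time.perf_counter() - start:.2f} s")
    return result

# Compare a CPU-bound step against a disk-bound step from your own workflow.
timed("compute", lambda: sum(i * i for i in range(10_000_000)))
timed("disk read", lambda: open("some_large_file.bin", "rb").read())  # hypothetical file
```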

Here is a list of common benchmarks, care of Wikipedia.
There are plenty of others. It doesn't take an internet genius to do a Google search to find them.

Wednesday, April 14, 2010

The Mother of All Boards

Continuing my quest for a new desktop PC / workstation, I find myself searching for the perfect place to mount all the components. Since I decided to go with what I know, including anecdotal proof of why AMD is the right processor for me, I now have to find a compatible motherboard to mount it to.

This is one of those areas where I have to again decide to "go with what you know." The dizzying array of vendors, makes, and models of motherboards would confuse even the most technocratic junkie among us. The spectrum has to be narrowed, and the easiest way to do that is to go with a recommended brand name. I have built two PCs in my day, both lasting over 5 years before becoming obsolete (with component upgrades along the way), and each time I chose an MSI motherboard. I have had no complaints to speak of, but there have been a couple of sporadic issues that I haven't been quite able to pin down. One recent issue, though, is that a memory slot got loose, making it unusable. Since this motherboard requires matched pairs of memory, I'm short two slots and therefore have only half the memory capacity I want. (Of course, my computer is over 5 years old and I've been making periodic memory upgrades. The loose slot could have been self-induced.)

This time around, based on recommendations from friends who also build PCs, I'm going to try a Gigabyte motherboard. Hopping over to Newegg, I start searching for Gigabyte motherboards that are compatible with AM3 chips (the Phenom II that I selected earlier). This narrows my choice down to 12 motherboards.

Narrowing the choices from this point is pretty easy. For example, USB 3.0 is the latest standard. I expect my computer to last a while, so I want it. I use an ATX form factor for my mid-size tower. Those two options narrow my selection down to 3 choices, which I can easily compare.
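The narrowing itself is nothing more than a filter over a spec sheet. A trivial sketch with made-up board entries (the real list came from Newegg), just to show the attributes I filtered on:

```python
# Made-up board entries for illustration only.
boards = [
    {"model": "Board A", "socket": "AM3", "form_factor": "ATX",      "usb3": True},
    {"model": "Board B", "socket": "AM3", "form_factor": "microATX", "usb3": True},
    {"model": "Board C", "socket": "AM3", "form_factor": "ATX",      "usb3": False},
]

shortlist = [b for b in boards
             if b["socket"] == "AM3" and b["form_factor"] == "ATX" and b["usb3"]]
print([b["model"] for b in shortlist])   # only Board A makes the cut
```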

There are many specifications to motherboards. In all honesty, most of them won't make much difference to the average user. A few key features:
  • Front Side Bus (FSB) Speed - this is a bit of a misnomer, but it is still somewhat representative of how quickly the components mounted to the motherboard communicate with each other. Because of changing chip technology and processes performed on-chip, comparing FSB between vendors or Intel vs. AMD cannot always be done one-for-one.
  • North Bridge - This will take research because unless you follow the PC industry closely, the code names on North Bridge (and South Bridge) will mean just as little to you as the code names for the CPUs.
  • South Bridge - see North Bridge.
  • Memory Standard - Shows how fast your memory is. Faster usually equates to better performance, and you have to make sure you buy memory sticks that match.
Even if you have an AMD CPU, you can still get a motherboard with an nVidia (North Bridge) chipset. When I started looking at the technology behind the bridges, mainly by researching Gigabyte's, Intel's, and AMD's website, I noticed something new.

In the past, since I was more comfortable with nVidia graphics cards, I chose a motherboard with nVidia chipsets. They should "communicate" better with each other. What I've noticed, though, is that there is still a bit of lag, or incompatibility, or some other sporadic issue with my setups that wouldn't allow me to squeeze out all the possible performance I should be getting. That's why this time around I spent more time and looked into using an AMD chipset with an ATI graphics card (and the fact that ATI has greatly improved since being purchased by AMD). My thought: keep it all in the family and everything should run more smoothly.

The latest chipset on the market for AMD is the 890GX. The price difference in motherboards between the earlier 790-series and the 890 is not too bad, so it is worth buying the latest tech if I want this computer to last. I tried comparing the options to an nVidia North Bridge motherboard and there really was no comparison. This was the final straw that put me over the edge to buy a complete AMD/ATI system instead of an AMD/nVidia hybrid. Using an AMD CPU, an AMD chipset, and an nVidia graphics card is really not a wise combination for a high-performance, stable computer capable of being overclocked. It was time for me to take a really close look into ATI graphics cards. Not only was I surprised at how well the tech level of ATI compared to nVidia (not something you hear very often in the market), but I was also able to find a contact at AMD who answered my technogeek questions comparing nVidia technologies to ATI technologies and open standards.

If I haven't said it enough already, AMD and ATI have really surprised me. My default line of thinking has been shifted and I am switching sides for a while - assuming my wife lets me buy a new computer. Here's to competition and seeing the best of technologies coming from both sides of the field.

(I would like to continue this to expand on what I learned from my contact at AMD, but I will deviate as I cover some interesting highlights from COFES 2010. Stay tuned.)

Saturday, April 3, 2010

Selecting the CPU

The first thing I do when building a new computer is start with the CPU. As mentioned in my previous post, I prefer AMD chips. It’s a personal preference and if I were to follow my own advice from that last post, I really should research and not just buy the comfort brand. Despite my contradiction, sometimes you just have to go with what you know. After all, we’re not all industry analysts and we all can’t keep up to date with every bit of changing technology.

To start the CPU search, head to the home page of the CPU of your choice and research what the latest in chips are. I can never remember the chipset names. For AMD, I can only remember the Athlon, Sempron, Phenom, and Opteron names and the general classification they are used in. In my opinion, Athlon is a chip near the end of its life cycle, being replaced in desktop PCs and workstations by the Phenom II. Opteron is still the server class. And Sempron is something that readers of this blog should not even consider unless buying a PC for your child… that doesn’t game.

During the days of single-core CPUs, the best chip to buy was the highest clock speed at the price break. In other words, I could spend an extra $10 or $50 to get the next increment in clock speed. Once I got to the $200 increment, I didn't bother going any faster. I still keep the pricing website bookmarked. With today's multi-core chips, changing cache sizes, and different Front Side Bus (FSB) speeds (or equivalent technologies), clock speed is no longer the only criterion for value. As a matter of fact, many applications are still single threaded, so you could just buy the fastest clock speed you can afford and make the decision easy. (SWGeek is starting a good series on PC building on his new blog and you can learn more about CPUs by linking here.)
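For the curious, the old "fastest clock at the price break" heuristic boils down to a few lines. The clock speeds below are approximate and the prices are placeholders, not quotes from any retailer:

```python
# Placeholder prices; approximate clock speeds. Illustration only.
candidates = [
    {"model": "Phenom II X2 545",    "ghz": 3.0, "price": 87},
    {"model": "Phenom II X2 550 BE", "ghz": 3.1, "price": 93},
    {"model": "Phenom II X2 555 BE", "ghz": 3.2, "price": 99},
    {"model": "Phenom II X4 965 BE", "ghz": 3.4, "price": 185},
]

for cpu in candidates:
    value = cpu["ghz"] * 1000 / cpu["price"]          # MHz per dollar
    print(f'{cpu["model"]}: {value:.1f} MHz per dollar')
```

The big price break shows up immediately: the value per dollar drops sharply once you cross into the higher-end parts.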

I'm selecting the AMD Phenom II X2. None of the applications I typically run can utilize 4 cores, so there's no reason to spend the extra money. I really like the idea of the energy-efficient models, even if they are 4-core, and would consider the cost difference if the clock speeds were higher. That narrows me down to model numbers 545, 550, 550 "Black Edition," and 555 "Black Edition." There is very little price difference, so my choice will depend upon what's available on Newegg at the time I make my purchase.

It just so turns out that my preferred chip (AMD) is on the upper curve from the last post, so I do get to stick with what I know AND follow my advice as posted. (You can get more information on which is better NOW from PCStats.com, even though they don't agree with me.) There certainly is some great technology in Intel's Core i5 and i7 chips, but not enough to push me into buying an Intel chip when I consider whole-system performance. In my next post, I talk about selecting the right motherboard. It doesn't matter how fast your CPU is if the motherboard can't move the information around quickly enough. Motherboard technology is where, in my opinion, AMD inches ahead of Intel in terms of overall system performance. Following that will be why I think ATI graphics are currently a "step" ahead of nVidia.

Also see AMD CPU Roadmap from pcgameshardware. I may just hold off on buying a new PC to get the 32nm chips.

Friday, April 2, 2010

Rethink Your Default Graphics Card - Watching Industry Trends

I am not an industry analyst, but as an engineer I do have an ability to track trends. As I try to convince my wife that the tax refund we will be getting is best spent on a new computer for me, I have been researching components for my next big gaming/CAD rig. (Not to mention the need for it to handle CADVille.) During my research, I found a repeating trend.

I should start by stating I've always been a fan of AMD CPUs. In my personal experience, they handle heavy math processes (like CAD, FEA, and matrix and other mathematical computations) better than Intel. No proof, just experience. I've also been more inclined to use nVidia-based graphics cards. nVidia is the marketing name everyone recognizes, and they appear in all the ads touting best performance for specific applications, as if the apps were designed specifically for nVidia. I've also heard bad stories about ATI drivers and the complex steps needed to update them. But as I was researching components for my potential new computer, I found something interesting that reminded me of a trend I have seen in the past.

Here's the trend I've noticed.
This is a simplified depiction of the trend. In it, you see both Intel and nVidia making technological leaps over time. (I depict a square saw-tooth pattern, but I do not mean to imply that the technology from these companies flat-lines between leaps. That is hardly the case; this is just a simplified illustration.) These are the big PR and marketing announcements that ring through the industry. On the other hand, AMD and ATI continually make steady improvements to their product lines over time. No big breakthroughs, no giant marketing campaigns, just steady growth. If you were to change the axes to Performance vs. Cost, you would see a steady increase in value from AMD/ATI and a step increase from Intel/nVidia. This trend matters because it shows there are times when nVidia is a better buy than ATI and times when ATI is a better buy than nVidia; the same goes for AMD versus Intel. If I were to put my finger on where we are on the curve right now, I would place us at a point where AMD/ATI sit slightly above Intel/nVidia in terms of value.
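To make the Performance vs. Cost framing concrete, the "value" I am talking about is just performance per dollar. Here is a tiny sketch with invented numbers (not benchmarks or street prices) showing how you would compare the two camps at the moment you are ready to buy:

```python
# Invented numbers for illustration only -- not benchmark results or real prices.
# Compare "performance per dollar" for the steady improver vs. the step jumper.
candidates = {
    "steady improver (AMD/ATI-style)":  {"relative_perf": 100, "price_usd": 300},
    "step jumper (Intel/nVidia-style)": {"relative_perf": 115, "price_usd": 450},
}

for name, c in candidates.items():
    value = c["relative_perf"] / c["price_usd"]
    print(f"{name}: {value:.3f} performance points per dollar")

# Whichever camp sits higher on this ratio *today* is today's better buy;
# after the next big "step" launch, the answer can flip the other way.
```

The point is simply that the answer is a moving target, which is why buying on brand comfort alone can leave value on the table.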

Not being very familiar with the ATI product line, I opted to probe AMD for some information on their cards, both the Radeon and the FireGL lines, and to figure out where the future of ATI graphics cards is going in terms of Win 7, DirectX 11, OpenGL, and some other technologies unique to ATI, and how they compare to nVidia's. I managed to find someone willing to entertain my conversation, and I was even shown a preview of the new FireGL V8800 model, not yet released to the market. Let me tell you.... AWESOME!

Over the course of the next few blog posts, I will highlight what I'm learning about the latest in computer graphics. The thought I want to leave you with is this: don't just buy based on brand name and comfort. We're on a different point of the curve now, and your best value may just be with the brand you don't hear much about.

Thursday, April 1, 2010

CADVille FAQ

There have been a lot of questions about CADVille today. To make it more convenient for everyone, I have reproduced the original FAQ here.
FAQ:
  1. Will my friends be able to track my progress? Yes, we plan for CADVille to have a Facebook, LinkedIn, and Twitter interface so that as you build, trade, give or receive a gift, or unlock new tools, everyone on your network will know (look for the #CADVille tag on Twitter).
  2. Does CADVille work with my design system? We are providing an API called CADVilleOpen. If your CAD system does not yet support CADVille, please call them right now and tell them “I want CADVille Support!”
  3. I do FEA. What about me? We are currently expanding the CADVille API to support the creation and trading of elements, nodes, and Solver tokens (good for 1 minute of solve time). Each “FEA analyst” starts with over a dozen nodes and elements!
  4. My company does not allow me to participate in social media or gaming while at work. Will I be able to use CADVille? Yes. When we charge your credit card, it will be listed under “Collaboration Tool”.
  5. My design tool only does 2D. Can I play too? Really? 2D? I guess we will let you play too but good luck selling your lines and circles and title blocks.
  6. This is really cool! Do you have any other product planned? Yes, we have several products under development. Watch for CADSquare, CADLife, CAD Wars, and World of CADCraft.
  7. This is a joke right? Yes. Many people are unaware of the April Fools tradition and we felt it necessary to include this in the FAQ and all product documentation.
Well, apparently it wasn't a joke to everybody.

Friday, February 5, 2010

Personal Message: Team in Training to Honor Past Friend

I'm going to take a short break from my regular topics and post a personal message. Not many of you know, but 2009 was not a very good year for me. In July I lost my father and a few short months later lost my good friend Armando, pictured here with his wife Traci.

With my father, I had some time to prepare from the date of the original diagnosis until his passing. Armando, on the other hand, was a shock. But through God, even the worst of tragedies can be turned into a positive.
  1. I'm shortening my list of good intentions. I'm sure many of you also have many good intentions, but turning those intentions into actions is not easy. I'm still working on it.
  2. I have found new interest in health and well-being. Many of you who follow me on twitter know that my tweets are as much about engineering and CAD as they are natural healing and new medical technologies.
  3. In a testament to how many people we touch each day, often without noticing, someone moved by Armando's story is going the extra mile. It is best to quote the message I got from Traci.
Hi friends!

I have some exciting news...The Leukemia and Lymphoma Society's Team
in Training will be honoring Armando at the Redman Triathlon in
September.

Finlay Woodruff, a co-worker of mine at Hilti, has been running races
and raising money for this organization for 5 years as a way to help
find a cure for blood cancers like Leukemia. He was honored at
Hilti's annual meeting this year with our Legacy Cup award for his
dedication to this charitable organization. At the meeting, he
approached me about running a race in Armando's honor. This will be
his first ever Triathlon.

I will be cheering Finlay on with support and fundraising help for
this race -- if you would like to help support him financially or just
learn more about Team in Training and what they do, please check out
his webpage below.

I'm sure there will be many other ways we can help support him as the
year goes on. Many of you know that my brother Justin is also an avid
runner, and he is joining Team in Training as well. We'll keep you
posted on how you can get involved in what we're doing.

I attached below the company news story that tells a little bit about
what Finlay does and also a link to his Team in Training webpage.

Love,
Traci

http://pages.teamintraining.org/gat/retriokc10/fwoodruff

All I'm asking is that you click on the link above and decide in your heart if this is a worthy charity to contribute to. Beyond that, sit back and take note of all the people around you. Take note of how you affect their lives, even in the smallest of ways. Take note of how your life positively or negatively affects them. Then, take note of where your life is taking you.

As my father used to say every time I left the house, "May God be With You."
I'll leave you with the one recording I could find of Armando. He was a fabulous musician, and I wish I had the page space to tell you all the ways he touched my life. He is truly missed.
www.youtube.com/watch?v=lI2npqlXc9g