Our Summer 2014 Trip Part 3

Day 15 July 23rd
We were so close to the holy land, the University of Notre Dame, that we considered it our duty to make the pilgrimage. This day we flew from KDVN in Davenport, Iowa, to KSBN in South Bend, Indiana. It was a very easy trip: I flew slightly southeastward to stay out of O’Hare’s airspace and just south of Lake Michigan.

kdvn-ksbn

 

I didn’t use flight following this time, as it was a short hop. We saw several other small aircraft in the area, so I swapped between the CTAFs of the local airports, self-announcing as I navigated the slew of municipal fields without causing any problems for the local traffic.

The Route on SkyVector

The landing at South Bend was pretty much uneventful. I did, however, initially line up on 9 right instead of 9 left as instructed. I caught my mistake while still far enough out that it did not cause any issues for the passenger jets landing there. I confirmed with South Bend approach that they wanted me on 9 left, and they confirmed that yes, I should land on the runway with the big “9L” painted on it. I guess it was a slow day and they were feeling snarky.

The local FBO, Atlantic Aviation, loaned us a crew car for a few hours, and we went on campus to visit the grotto and the bookstore, as I needed a new hat. A quick shout-out to the really great people at this FBO: they treated us as nicely as the folks coming in on the Learjets. I had them top us off with fuel, which waived the landing and parking fees.

nd-grotto
After visiting the grotto, lighting a candle for all my aviation family, and buying a hat, we headed back to the FBO and traded the crew car for my plane. The ground control folks wanted to know our destination, and we ended up chatting about our flight to Niagara Falls. The departure folks even wished us a safe journey to the falls; some really nice people in that tower. Our planned route took us northeast along the shore of Lake Erie and then finally north to the town of Lockport, New York.

ksbn-0G0

 

Link to the route on SkyVector

flightaware-14G-0G0

The flight along the lake was easy going and we stayed with flight following all the way. We did run into a few clouds on the way and I had to climb to get over the layer. Once we reached the northeastern edge of Lake Erie, the clouds stopped and we made a leisurely descent into the town of Lockport. A few words about this airport: the runway was in rough shape, as you will see in the video. The western edge of the runway was full of potholes, which were marked with orange cones, so I had to land on the right side of the runway. You will hear my wife’s comment in the video that it was a bumpy runway.

All during this trip I looked for small airports and local attractions that we could patronize. Lockport was a gem of a place to stay. We found a locally owned and operated motel, Lockport Inn and Suites, right out of the ’50s that was super nice, and there was a fantastic Greek diner, Kalamata Family Restaurant, which served really good food. In fact, I tasted some of the best haddock I’ve had in a long time.

Day 16 July 24th

We arrived late in the day and got a great night of sleep in our motel. The next morning, we rented a car and spent the day exploring Niagara Falls. We walked around and looked at all the falls and generally had a good time, as it was my first trip there.

us-at-falls

niagara-falls

To top off our evening we went to the local drive-in, the Transit Drive-In Theater, and ended up seeing the Disney movie Planes 2, which was really fun. Lockport boasts one of the few triple-feature drive-ins that show three different movies on three different screens at the same time.

https://www.transitdrivein.com/

Day 17 July 25th

Today we departed Lockport in the mid-morning for KAQW Harriman and West Airport in North Adams, Massachusetts, where my mom lives.

0G0-KAQW

The trip was pretty short and easy. There was no fuel at Lockport, or at least I could not find anybody around to unlock the pumps, so I planned a fuel stop at another airport not too far away. About halfway to our destination, just west of a beautiful lake, was a little private strip open to the public: Skaneateles Aero Drome in Skaneateles, New York, pronounced scan-eet-alice. Nice place and inexpensive fuel!

The Route on SkyVector

After the refuel we flew pretty much directly to North Adams, which took us right over Albany’s Class C airspace. There was considerable haze on the mountain ridges as we approached our destination, so I had to stay at a somewhat higher altitude than planned until I was sure I had cleared the ridge-line obstacles. Once I was sure we were clear of the wind turbines along the ridge, I descended into the valley and landed.

Landing at Harriman and West is a bit intimidating to the uninitiated, especially when landing on runway 29. The downwind leg takes you just over the reservoir, a lake that sits on a hill at about 1,200 feet. Just after coming abeam the numbers and reducing power, you are suddenly looking at a lake just a few hundred feet below you on the side of a mountain. Pilots who are not prepared for this spectacle often stay too high on the base leg or extend the downwind much too long and turn late.

A quick note about this airport: it is run entirely by volunteers. The city does not have the funds to maintain it, so the local pilots keep it up all on their own. I think that is great, and they seem to be doing a fine job.

harriman-and-west-airport

We stayed here three days with my mom and enjoyed some home cooked meals. We visited a few of the local attractions and even drove up Route 2 to the top of the mountain so we could take a picture of the valley.

north-adams

Day 20 July 30th

This day we left North Adams KAQW and flew a southerly route to Wurtsboro-Sullivan County Airport, N82.

kaqw-N82

We were there for the next four days for my wife’s 20th Up With People reunion. The trip there was a bit nerve-racking. We were in the lull between fronts as a low-pressure system spun over upstate New York, so I had to time it well to get out of North Adams and into Wurtsboro before the next wave of storms. Fortunately my calculations of storm movement and direction were accurate, and we made an uneventful, if a bit bumpy, trip there.

clouds-sullivan

Route on SkyVector 

Sullivan County is just a few miles off the Hudson River in upstate New York. The movie Dirty Dancing is set in this area. There are several vacation lodges with cabins and rooms all over the area, and we saw many city dwellers dipping their feet in the ponds around the county. We stayed at a really wonderful place with a great bonus: across the street from the hotel was a coffee shop with homemade bagels, plus a microbrewery. It was like a one-stop shop for all my vices.

beer-flight-sullivan

While I was there, I ended up taking a short flight around the area with a couple of the other husbands who got dragged to the event. We got fuel at Kingston-Ulster 20N, got chatting with the attendant, and he let us in on a little secret: there is a fantastic restaurant at Sky Acres Airport 44N. So we flew down to Sky Acres, which happens to have an uphill runway on the side of a hill. It was kind of fun landing there. We got all excited to eat, only to find out the restaurant was closed during the week and only open on weekends. Oh well. We did also find out that the company Sky Geek – www.skygeek.com – is based at this airport. All in all it was still a fun day, and after parking the plane we went and drank beer with the other husbands. I put this day in the WIN column!

guys-beer

Our Summer 2014 Trip Part 2

Day 9 July 18th

We flew back from KBOI to Johnson Creek in the evening. In that country it is OK to fly in the morning up until about 1 pm and then again after about 6 pm. What I was not aware of is that right about that time in the evening the winds start shifting from the south to the north, and they often gust during the shift.

I got my first taste of a downdraft and tailwind mixed together. While flying the downwind leg for 17 I observed that the windsock was hanging still, so I continued with my plan to land on runway 17. As we turned final for 17 and descended, I had the power all the way out to help with the descent, as I was again a bit high. All of a sudden we got pushed hard from behind, the nose dropped, and the plane accelerated. I went to max power and started climbing out for a go-around. As luck would have it, just about that time the constable from Yellow Pine, the village north of the field, was passing by, and I guess I flew over his vehicle close enough to make him take notice. He stopped by the airfield to make sure everything was OK. I ended up landing on 35 this time, as the wind was strongly out of the north.

Day 11 July 20th

Today we flew from Johnson Creek to Tillamook, Oregon, via the Columbia River. This route is often called “The Gorge.” We departed Johnson Creek early in the morning, used the northern pass out of the mountains, and flew over to McCall to refuel. We then headed west to the Baker City VOR (BKE) and then northwest to the Klickitat VOR. From that point on I was following the Columbia River. There was cloud cover at about 1,500 feet, which forced me down below the altitude at which flight following could track us, so we dropped flight following and continued on our own.

columbia-river-mountains-363

We needed fuel, and the recommended stop was 4S2, Ken Jernstedt Airfield in Hood River, Oregon, referred to as “Hood River” by the locals. This was a very challenging landing, as the wind was gusting at 20 knots down the runway. I also found out that there is a huge museum right behind this airfield, the Western Antique Aeroplane and Automobile Museum, which we missed.

hood-river-junction

After we took off and proceeded west, I contacted Portland Approach and we stayed on with them as far as we could. They were really nice and let me cut through the Class C airspace a little north of the airfield and river.

Link To Route on SkyVector

The cloud layer was pretty low at this point and we flew along under it at about 1000 feet using the Columbia River as our guide. Then we emerged out from under the clouds and we were almost to the coast. We flew over Astoria Regional Airport, and then turned south down the coast to Tillamook.

oregon-coast-366

oregon-coastline-367

Sadly the museum did not live up to the hype. The museum owner had some dispute with the owner of the aircraft collection, who pulled all his airplanes out of the museum. That was really disappointing. There were still some really cool planes, but I did not get to see all the ones I had hoped to see.

tillamook

tillamook-museum

Flying back was pretty much uneventful. Once clear of the low clouds, we bobbed along at about 9,500 feet most of the way. We did get to see something unique: a forest fire burning north of our route that looked like a volcano erupting from the middle of the woods.

idaho-forest-fire-368

On this trip we flew to the north side of Johnson Creek and came into the airport through the northern pass, which starts just east of the McCall airport. There is a road which follows the river all the way from the town of McCall into Yellow Pine. I wish I had video, as the experience of flying this route was spectacular; however, my camera operator and wife did not enjoy it as much as I did. We were flying at about 6,500 feet with peaks all around us and narrow, deep valleys below, which made her a bit nervous. The landing at Johnson Creek was uneventful. This was our last night at Johnson Creek, and we planned to depart in the morning and start heading east.

Day 12 July 20th
This morning we departed from Johnson Creek and flew to Rock Springs. It was sad to go, as we had such a great time there. The caretakers, Phil and Roxie, are some of the best folks I have ever met. They really do a great job keeping that place clean and running smoothly.

takeoff-3u2-347

I want to mention my death-defying takeoff, not to brag but as a learning point. We were 200 lbs under gross weight, the time was 7:30 am, and the density altitude was 5,900 feet. I had to blend a soft-field and a max-performance takeoff just to clear the trees. I will admit it frightened me when I had to yank the plane into the air and then ride just above stall at about 50 fpm, then slowly, and I mean ever so slowly, creep over to the canyon wall and use the updraft to gain altitude.

Once we got going and I calmed down from the takeoff, the rest of the flight was good. We flew south out of the mountains and made one last stop at Payette S75 for a full tank of that inexpensive mogas. We picked up flight following as we traveled southwest towards Boise, and I am glad we did. They were able to help us get around some rain storms on our intended path.

 

kboi-krks-374

Once we got past the rain near Boise, the skies were clear and the tailwind was something! I tried to hold an altitude of 10,500 feet, but that was difficult in the lumpy air. The airspeed indicator showed about 90 knots while the GPS showed a ground speed of about 120 knots! I must admit it was a bit rough going and my arms were tired from fighting the turbulence. We did, however, get a nice look at the Great Salt Lake in Utah in the distance as we headed back to Rock Springs.
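
For the curious, here is a rough back-of-the-napkin sketch of that airspeed arithmetic in Python. The helper function and the 2%-per-1,000-feet rule of thumb are my own illustration, not anything from the panel, and the rule is only an approximation:

```python
# Rough sketch: how a 90-knot indicated airspeed squares with a 120-knot ground speed.
# Rule of thumb (approximate): true airspeed (TAS) runs about 2% above indicated
# airspeed (IAS) for every 1,000 feet of altitude.

def true_airspeed(ias_kts: float, altitude_ft: float) -> float:
    """Approximate TAS from IAS using the 2%-per-1,000-ft rule of thumb."""
    return ias_kts * (1 + 0.02 * altitude_ft / 1000)

ias = 90            # knots indicated, from the panel
altitude = 10_500   # feet, our cruising altitude
gs = 120            # knots ground speed, from the GPS

tas = true_airspeed(ias, altitude)   # ~109 knots through the air
tailwind = gs - tas                  # ~11 knots of free push
print(f"TAS ~{tas:.0f} kts, tailwind component ~{tailwind:.0f} kts")
```

So even before the wind chipped in, the thin air alone was worth nearly 20 knots of ground speed.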

 

salt-lake-utah-370

This time our landing at Rock Springs was on that huge runway 27. I had to keep telling myself that I was at the proper pattern altitude because the runway was so big and it looked too close.

Day 13 July 21st 

We spent the night in Rock Springs at the Holiday Inn Express and then flew out early in the morning. I knew that the takeoff would be a very slow climb, so I was prepared for it. I did not use flaps; I just let the plane roll down runway 27 (it is plenty long) until it was ready to fly and then slowly climbed out. Going east was much easier, as there is no steep mountain to clear, just a nice open valley to follow through the mountains. I essentially followed Highway 30 all the way out of the Rocky Mountains.

mbw-panorama-348

 

The Route on SkyVector

Once we exited the Rockies, I flew almost due east, then turned slightly northeast to our refueling stop, KANW Ainsworth Regional Airport in Ainsworth, Nebraska. One interesting fact about this little airport is that the FAA has a big office here, and I believe several flight service folks work there. However, I did not see them, as they were behind locked doors. The FBO did have a really nice pilot lounge and some really comfy couches, which made for a nice snack break.

kdvn-from-air-349

 

Ainsworth Regional Airport sits just about on the edge of the terrain change. Once east of this airport, everything started getting greener and a whole lot closer to sea level. I was able to fly at lower altitudes and get bounced around much less.

From Ainsworth Regional Airport we continued east at considerable speed (a 30-knot tailwind gave us an average ground speed of 140 knots) to our evening stop, Davenport Municipal Airport in Davenport, Iowa. We did encounter a few light rain storms that forced some course deviations along the way. As always I was on flight following, and they were a big help. They even asked me for a few PIREPs as we passed close to some of the storms. I thought it was cool that many other pilots got to benefit from my reports on the storms: their intensity, their tops and bottoms, and the air conditions around them.

The Route on SkyVector

The landing at Davenport was a bit of a challenge. There were storms in the area, and the wind was a right quartering crosswind gusting to about 17 knots. I really had to work hard to get the plane down safely on the runway.

flightaware-kdvn1-346

 

I had my right foot all the way down on the right rudder and it was still not quite enough to keep us straight. I added about 8 knots, half the speed of the crosswind, to my approach speed, so I was landing at about 78-80 knots. With that extra speed I had to work a bit harder to get the plane down onto the runway. In the video you will notice my hand on the dashboard; that was to help keep me steady as I fought the crosswind. After landing you will notice that we get pushed to the left and I have to steer the plane back to the center.
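
Here is a quick sketch of that speed padding in Python. The 70-knot normal approach speed and the 45-degree wind angle are my assumptions for illustration; only the 17-knot gusts and the half-the-speed padding come from the story above:

```python
import math

# Sketch of padding the approach speed for a gusty quartering crosswind.
# Technique from the text: add half the crosswind speed to the normal approach speed.

def crosswind_component(wind_kts: float, wind_angle_deg: float) -> float:
    """Crosswind component of a wind blowing wind_angle_deg off the runway heading."""
    return wind_kts * math.sin(math.radians(wind_angle_deg))

normal_approach = 70   # knots; assumed typical Cherokee final-approach speed
gusts = 17             # knots, the quartering gusts that day
padding = gusts / 2    # the half-the-speed rule used above -> 8.5 knots

print(f"Crosswind at 45 deg off the nose: ~{crosswind_component(gusts, 45):.0f} kts")
print(f"Target approach speed: ~{normal_approach + padding:.0f} kts")  # ~78 kts, as flown
```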

The folks at Davenport were super nice. Since I bought gas and planned to spend the night, they loaned us a crew car to take to our hotel. This car was a couple of notches above the average courtesy car we normally use: a super nice, late-model American sedan with plush leather seats and really good A/C. Another amazing bonus was a Cracker Barrel restaurant right across the parking lot from our hotel. As luck would have it we had a gift card for Cracker Barrel and feasted like kings!

Our Summer 2014 Trip Part 1

Planning the Route

The route for this trip was planned and discussed for weeks before the actual event. I went over it with several other pilots and finally came up with what we considered a safe and survivable route.

The numbers (Best Estimates)

Miles traveled: 6600
Hours of flying: 68
Gallons of Fuel: 680
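
As a quick sanity check, the estimates above hang together nicely; here is the arithmetic in Python (the snippet is just my illustration, and the log doesn’t say whether the miles are statute or nautical):

```python
# Quick sanity check on the trip estimates above (values copied from the list).
miles = 6600     # statute or nautical not stated; treated as-is
hours = 68
gallons = 680

print(f"Average speed: ~{miles / hours:.0f} miles per hour")   # ~97
print(f"Average burn:  ~{gallons / hours:.0f} gallons/hour")   # ~10, right for a Cherokee
```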

The Entire Route on Skyvector (Best Guess)

The Entire Route
A Few Words on Back-Country Flying

My original flight training took place in the western mountains of Massachusetts, so I had some relevant experience flying in mountainous terrain and understanding the weather phenomena that occur there. I did, however, recognize that the mountains were much higher and the risk greater in the Idaho back country. I devoted quite a bit of time to reviewing everything I could find about flying into the Idaho back country and, more specifically, Johnson Creek airport. I read pamphlets and web articles and watched many videos on the subject. I also started obsessing about the weather about two weeks before we departed, so I knew the weather patterns well.

One video that was particularly well done and very informative is this one by Greg Swingle:

I feel these factors (good route planning, studying up on back-country flying, and knowing the weather patterns) were essential to the success of the trip. Having a backup plan was also very useful: I had a plan A, B, C, and sometimes D for each leg of the trip.
Day 1 July 10th

We loaded the plane to within 60 lbs of gross and headed off northwestward. The basic plan for getting to Johnson Creek was to fly four hours, refuel, and then fly four more hours before stopping for the night. Our first planned stop was M83 McCharren Field for a quick refuel and stretch break. I picked up flight following just after liftoff, and for the next four hours they were a constant companion in my headset.

Leeward to MCharren

Link to Route on Skyvector

The flight went well until we got close to our destination. I was turning east, then west, ascending and descending to avoid the rain clouds that popped up. About fifty miles from our planned stop at McCharren, I realized we were getting low on fuel. When I say low, I mean down to the FAA minimum of 30 minutes of reserve fuel. I changed plans and diverted to the closer planned alternate of KUBS Columbus-Lowndes County Airport in Mississippi.

KUBS was a nice stop. The FBO was in the midst of a facelift and a crew was working on the interior. The bathrooms were clean and in good shape, and they had a nice little pilot lounge in the making.

Russ pumping fuel at KUBS

After the refuel we took off and had to deal with the Class C airspace right next door. What a pain! Instead of letting me turn west then north, they had me turn south, then east, then north and go all the way around.

Flying Around Columbus

Quite frustrating, but I understood afterward that had I gone east then north I would have ended up right in the middle of the approach path for incoming flights. We finally got back on track with flight following and continued to KMIO Miami Municipal Airport in Miami, Oklahoma. Quick note: Miami is pronounced “my-am-a” at this particular location, as it is a Native American word.

Link To The Route on Skyvector

Had a few moments of excitement on the way in to KMIO. Encountered some moderate rain, and approach guided me around the worst of it. Went through about 4 miles of downpour that brought visibility way down.

Rain over KMIO

Oh! The best part: the runway at KMIO was closed for repairs. I saw the NOTAM during my morning flight brief on DUATS and called the airport to confirm. The airport manager let me know there was no issue and instructed me to land on the taxiway. No problem!

NOTAM: MLC 08/194 MIO RWY 17/35 CLSD PARL TWY 4000FT X 50FT AVBL SR-SS VMC 1408121402-1408261400

What a wonderful place. One of the longest restored sections of Route 66 runs right through the middle of this town. When I called ahead about the runway I also asked about a courtesy car, and the manager arranged one for us. Our ride was a big old Ford Crown Victoria that was a retired police car.

Miami Route 66

It swayed and rattled but ran just great and got us to dinner and our hotel just fine. We ate at Montana Mike’s and then cruised Historic Route 66 in the courtesy car. Along the way we saw a bunch of restored landmarks, including an old gas station, a theater, and several shops.

 

Day 2 July 11

Departed KMIO, picked up flight following, and flew to our planned refueling stop, Imperial Municipal Airport KIML in Imperial, Nebraska. We got a great tailwind, which is unusual when traveling west.

crop-duster-357

The airport itself was located in the middle of acres of corn fields. The runway seemed to be made of a series of cement tiles connected together, which made my tires click out a tune as we landed. I picked this airport because it was on our route and had some of the least expensive 100-octane low-lead fuel, commonly called 100LL, for the plane. Our Piper Cherokee burns about 8 to 10 gallons of fuel an hour and goes through 40 gallons every four hours of flight. Finding less expensive spots to stop for fuel really saved us some cash.
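
That four-hour rhythm falls straight out of the fuel math. Here is a small sketch; the 48 gallons of usable fuel is my assumption for a typical Cherokee, while the burn rate and the 30-minute reserve are from the text:

```python
# Rough fuel-stop planning sketch for a Cherokee (helper and tank size are my assumptions).
BURN_GPH = 10      # gallons per hour, the conservative end of the 8-10 gph above
USABLE_GAL = 48    # assumed usable fuel for a typical Cherokee
RESERVE_HR = 0.5   # the FAA 30-minute minimum reserve mentioned on Day 1

def max_leg_hours(usable_gal: float = USABLE_GAL) -> float:
    """Longest leg that still lands with the 30-minute reserve intact."""
    return usable_gal / BURN_GPH - RESERVE_HR

print(f"Max leg: {max_leg_hours():.1f} hours")   # ~4.3 hours, hence the four-hour legs
```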

Link To The Route on Skyvector

The next leg took us from KIML to KRKS, Rock Springs – Sweetwater County Airport in Rock Springs, Wyoming. This was the big moment where we flew up to and then across the Rocky Mountains. I had thought about this moment quite a bit, as I would be flying at 10,000 feet yet only be a few thousand feet above the ground, since the peak elevations here run around 8,000 to 9,000 feet. Fortunately the northern route afforded us plenty of vertical clearance and a wonderfully scenic flight path.

SkyVector Imperial to Rock Springs

While on flight following we got treated to ATC rerouting several flights due to heavy precipitation over Denver. We stuck to our flight plan of MBW to CKW to OCS, which kept us out of the rain. Here is a pic of the storm over Denver; it was enormous.

storm-over-denver-359

The ride into the mountains was turbulent. I was stuck below a near solid cloud layer at about 10,000 feet and bumped along all the way to KRKS. ATC lost us on radar for a while between Medicine Bow and Cherokee, but picked us back up as we got closer to Rock Springs. An interesting note about the runways at KRKS: runway 09/27 is huge (10,000 x 150 feet)! I hesitated for a moment and had to check the sectional to confirm the field elevation before setting my pattern altitude.

Link To The Route on Skyvector:

The hospitality here was great: we were warmly welcomed and chatted with the locals about the flying conditions on the way in. We got a ride to our hotel from some nice folks. The following day we planned our travel to S75 Payette.

russ-anna-krks-360
Day Three July 12th

Rock Springs to Payette

Took off from KRKS Rock Springs early in the morning and had a very slow climb out. The density altitude (DA) increase was already 2,000+ feet at 8:00 am; the field sits at 6,765 feet, so the DA was 8,765+ feet at takeoff. With full fuel and 60 lbs under max weight, we rolled down the runway almost 5,000 feet and then climbed ever so slowly at 50-100 feet per minute (fpm). At sea level a normal climb for this plane, even fully loaded, is around 500-600 fpm. This poor climbing performance made me a bit nervous, because the mountain ridge ahead of us was about 8,300 feet high. We slowly got to 8,500 to clear the ridge, then lowered the nose and cruise-climbed to 10,500 feet.
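
For reference, here is the rule-of-thumb density altitude arithmetic in Python. The 18 C morning temperature is my illustrative guess (I didn’t log it), and the sketch assumes pressure altitude equals field elevation:

```python
# Back-of-the-envelope density altitude (DA) check for a hot-and-high takeoff.
# Rule of thumb: DA ~= pressure altitude + 120 ft per degree C above standard temperature.

def isa_temp_c(altitude_ft: float) -> float:
    """Standard temperature: 15 C at sea level, lapsing about 2 C per 1,000 ft."""
    return 15 - 2 * altitude_ft / 1000

def density_altitude(pressure_alt_ft: float, oat_c: float) -> float:
    return pressure_alt_ft + 120 * (oat_c - isa_temp_c(pressure_alt_ft))

field = 6765  # KRKS field elevation in feet; standard pressure assumed
oat = 18      # degrees C, my guess for that morning (about 64 F)

print(f"DA: ~{density_altitude(field, oat):.0f} ft")  # ~8,750 ft, in line with the 8,765+ above
```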

One part of the flight that was particularly scenic was the trip over Bear Lake and through Logan Pass. We flew west towards and then over Bear Lake to the lakefront town of Garden City.

East Over Bear Lake

Bear lake South

Once over the lake, we followed the highway west past Garden City, through the mountain pass, and into Logan, Utah. It was really something to see: as we approached Logan, the mountain pass dropped off dramatically, a few thousand feet down into the valley where the city of Logan resides.

Logan Pass

Link To The Route On Skyvector:

We continued our flight over the BYI and BOI VORs and then started our descent to S75, Payette Municipal Airport in Payette, Idaho. Payette is a great little airport with friendly folks. They had the best price on avgas, also offered 91-octane mogas, and had a nice courtesy car. I made a bit of a bumpy landing coming in from the south, as the rough terrain around the airport produced some turbulence.

I was a little worried about the climb out in the morning. We planned to take off early to the north for that slow climb up to altitude. The following day was the arrival at Johnson Creek.
Day Four July 13th

We set out early from our hotel, drove the courtesy car back to the airport, and departed from Payette. We actually had no problem at all taking off, as Payette sits at 2,228 feet ASL and the morning air in the valley was cool with no wind.

Payette to Johnson Creek

For our first landing at Johnson Creek Airport I chose the southern approach. That means I flew southeast from Payette over Emmett Field S78, turned northeastward to Horseshoe Bend, then flew north along the railroad track up to Cascade airport. After that I turned northeastward and followed a road through the notch in the mountain range until I intercepted Johnson Creek. From there I turned north and followed the valley right into Johnson Creek Airport.

 

The Route on Skyvector:

A brief note on landing at Johnson Creek. The preferred landing runway is 17, as the runway slopes upward to the south. The canyon walls top out at about 8,900 feet ASL, but the runway is at 4,933 feet ASL. That means you have to fly the pattern at about 5,900 feet, still far below the canyon rims and very close to the runway. When approaching from the south you have to fly the downwind leg very close to the runway and very close to the trees in the canyon. For a first-timer this was intimidating! I found myself watching the trees and the altimeter and struggling to find the altitude that felt safe.
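
The arithmetic behind that tight feeling, sketched in Python (elevations from the paragraph above; the standard 1,000-foot pattern height is my assumption):

```python
# Why the Johnson Creek pattern feels so boxed in (elevations from the text).
field_elev = 4933    # runway elevation, feet ASL
pattern_agl = 1000   # assumed standard pattern height above the field
canyon_rim = 8900    # approximate canyon-wall tops, feet ASL

pattern_alt = field_elev + pattern_agl
print(f"Pattern altitude: ~{pattern_alt} ft ASL")               # ~5,933, the ~5,900 above
print(f"Rock still above you: ~{canyon_rim - pattern_alt} ft")  # ~3,000 ft of canyon wall
```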

The landing pattern cannot be squared off like we might do back at home. I flew the plane northbound past the field to what is designated as the widest part of the canyon. At this point I did a descending 180-degree turn to the left as I added flaps. This did not allow me to descend enough, and I was high on final, a common rookie mistake. When I was abeam the numbers I should have been a maximum of 800 feet above the runway; however, those trees looked really close at that altitude and I opted to stay a bit higher. To make the landing I did a sideslip and held it almost all the way to the runway. My first landing was a bit steep and fast, but the runway is 3,400 feet long, so I had plenty of space to let the plane roll and burn off speed.

A couple of other points to consider. I was moving way faster in that thin air: although my airspeed indicator said 70 knots, it was in reality closer to 80 knots over the ground, so things happened fast. The direction of the wind through the canyon that day also created an updraft on my downwind leg. I was reducing power and still climbing, and I had to push the nose down and reduce power quite a bit to get down to pattern altitude.

Once we landed at Johnson Creek, we camped for the next week. The daytime temperature was almost 100 degrees, while at night it would drop to about 45 degrees, which made for some cold nights. The good news is that there is a shed full of extra camping gear, left behind and donated by other campers, so we were able to find some extra sleeping bags!

The scenery here is breathtaking. The mountains are tall and steep and lush with green. The valleys are spectacular, as they are full of life and beauty.

Johnson Creek Panorama

Our campsite next to the river with all the comforts of home.

Our campsite at 3U2
Day 8 July 17th

On this day we went to Sulphur Creek ID74, pronounced “crick,” for the famous $20 breakfast. We had a bit of haze during the flight in, which we later discovered was smoke.

sulfur-creek-menu

The breakfast menu is very simple: Yes or No. The food was fantastic. The breakfast consisted of eggs, pancakes, sausage, homemade apple butter, and piping hot coffee. We were able to sit outside and watch other planes come in and land on the runway.

 

My landing was a bit of an adventure, as I think I hit a gopher on rollout. Look and see:

Sulphur Creek is a fly-in resort with a lodge and cabins. The price to stay there was a bit steep, but you do get all your meals included, plus a warm comfy bed in a cabin. After freezing for a few nights I was tempted to plunk down the cash for a warm bed!

Sulphur Creek Airstrip
Remember back on day 1 when I mentioned that we flew through some precipitation near KMIO? Well, it seems that precipitation included some very abrasive droplets: the propeller was stripped nearly clean of paint on the back side. I made arrangements with a prop shop in Boise, Idaho, to fix my prop and re-balance it.

On the flight south from Sulphur Creek we flew to KBOI Boise Air Terminal/Gowen Field Airport to get the prop re-balanced. As we traveled south towards the airport, the valleys were filled with smoke and only the very peaks of the mountains were visible.

Smoke in the Mountains

It was a strange experience to see the mountains this way, as only a few days earlier we had seen them quite clearly. The smoke also significantly reduced visibility: as I got closer to Boise, I estimate I could see about 3 miles, and sometimes even less.

More Smoke in the Mountains

When I was about 20 nm from KBOI, I radioed approach to gain access to the Class C airspace. I was denied and told to hold outside the airspace. I ended up flying a little racetrack pattern for about fifteen minutes while I waited. I was then granted special VFR (SVFR) access and given a specific altitude and vector to fly. I did not see the actual runway until it was very close. Approach guided me to a left base for runway 10R and then turned me over to the tower.

The local prop shop kept my plane overnight and the wife and I spent the night in a local hotel. I slept almost 12 hours as sleep out there in the woods was not as warm and restful as I had hoped.

My Basic Toolkit

I wanted to create a small toolkit for my airplane. Nothing too fancy here, just a simple tool bag that would cover the basic repairs I might encounter.

At the top is the tool bag, bought at Big Lots for $5.99.

Top row left to right

  • Digital Volt/Ohm meter
  • Soldering Iron & Solder
  • Lengths of wire
  • Super glue gel (resealable)
  • Twine
  • Electrical tape
  • Ratchet screwdriver & bits in container
  • Small hex head driver
  • Box cutters
  • Medium hex-head driver
  • Medium flat-head driver
  • LED Flashlight
  • 3/8 socket set with extension
  • Sockets: 1/4-3/4
  • Spark Plug Socket

Second Row Left to Right

  • Twist Pliers
  • .032 Safety Wire (Aviation Grade)
  • Diagonal cutters
  • Mix of common bulbs, screws, fasteners in container
  • 1/2 dozen different size zip ties
  • Hex head wrenches: .050-3/8
  • Wrenches: 9/16 – 3/8
  • Mini Vise-grips
  • Small adjustable wrench

Capturing Organizational Knowledge

Capturing Organizational Knowledge:

Approaches to Knowledge Management and Supporting Technology

By

Russ Wright

Knowledge Management

Although it is said that money makes the world go around, the use of knowledge has displaced money as the primary business driver because, according to Drucker (1988), organizations discovered that organizational knowledge is the most useful tool for gaining a competitive advantage. Organizations identified and leveraged the knowledge held by individuals and the organization itself to increase their competitiveness (Baird, Henderson, & Watts, 1997). Over the past three decades, information and the technology to support it have grown at an explosive rate, and the wealth of information available has rapidly advanced many fields, including electronics, computers, and communications technology (Adomavicius, Bockstedt, Gupta, & Kauffman, 2008). Knowledge management is the inevitable result of rapid progress in information technology (IT), globalization, and rising awareness of the commercial value of organizational knowledge (Prusak, 2001). The existence of all this information forces organizations to find a way to handle it and transform it into actionable knowledge. Thus the problem lies not only in interpreting, distilling, and sharing the information, but also in efficiently turning it into knowledge.

The purpose of this document is to explore how knowledge has become the most important resource, and learning the most important capability, for an organization that wants to compete in the marketplace. There is a discussion of the background on creating a competitive advantage and the importance of learning within an organization. This document also compares and contrasts the major approaches to knowledge management within an organization and examines the role that computer technology plays in capturing organizational knowledge. The conclusion finds that the field of knowledge management is still evolving and that Web 2.0 technology might change the way knowledge is captured within an organization.

Background

Knowledge for A Competitive Advantage

The realization that knowledge, when organized and viewed through the lens of competitive factors, could help an organization gain a competitive advantage formalized the beginning of knowledge management. Porter (1980) explained that the existing model of developing a business strategy was no longer working. He brought together the ideas of the Harvard Business School and the Boston Consulting Group into a new business strategy commonly called the five forces model, displayed in Figure 1 below. This model used five factors of competition as the basis for a business strategy: (1) industry competitors, (2) pressure from substitute products, (3) bargaining power of suppliers, (4) bargaining power of buyers, and (5) potential entrants. The author explained that analysis of these five areas allowed a business within a particular industry to establish itself, react to these forces of competition, and profit from them. Albers and Brewer (2003) explained that examining each of these five forces required specific knowledge within that particular competitive factor. Accordingly, knowledge management began with the need to understand the complexities of each of the five factors. Yet knowledge alone was not enough, as organizations had to learn from the analysis of the five factors and adapt to the ever-changing market.

Figure 1

Porter’s Five Forces Model

porter-five-forces

The Learning Organization

Possessing knowledge of competitive factors is not enough of a business strategy to make an organization competitive and profitable. Instead, the organization must adapt and take advantage of opportunities to remain competitive, because learning is an organization’s most important capability (Earl, 2001; Grant, 1996; Zack, 1999a). Nonaka (1991) explained that learning must be integrated into the culture of the organization and not be a separate activity performed by specialists. Senge (1994) described a learning organization as a place where “people continually expand their capacity to create the results they truly desire, where new and expansive patterns of thinking are nurtured, where collective aspiration is set free, and where people are continually learning how to learn together” (p. 1). Thus, a learning organization embraces a culture where the ability to create and share new knowledge gives it a competitive advantage. Still, defining and implementing the skills an organization requires to embrace learning is a complicated process.

Creating and sustaining a learning or knowledge-creating culture requires an organization not only to engage in specific activities, but also to develop a new mindset. Argyris and Schön (1978) theorized that learning involved not only detecting an error and correcting it, but changing the way the organization behaves as a whole through policy change. Garvin (1993) built upon this work and created a set of activities in which learning organizations must engage to sustain a knowledge-creating culture. The author defined these activities as: (1) systematic problem solving, (2) experimentation with new approaches, (3) learning from their own experience and past history, (4) learning from the experiences and best practices of others, and (5) transferring knowledge quickly and efficiently throughout the organization. The author further explained that applying these practices is not enough, as real change must include an analysis that goes beyond the obvious by delving into the underlying factors. As such, both learning activities and a culture change are needed to create a learning culture.

All the aforementioned work led to the creation of a field of research commonly called knowledge management. According to Alavi and Leidner (2001), knowledge management is made up of several somewhat overlapping practices that an organization can use to find, create, and share knowledge that exists within the individuals or processes of an organization. Another view of knowledge management defined it as “an institutional systematic effort to capitalize on the cumulative knowledge that an organization has” (Serban & Luan, 2002, p. 5). Consequently, knowledge management is deeply connected to the people, procedures, and technology within an organization.

Approaches to Knowledge Management

There are different views on the definition of knowledge, which led to the creation of multiple models of knowledge management (Carlsson, 2003). The first series of approaches to knowledge management defines knowledge as a “Justified True Belief” (Allix, 2003, p. 1); these models place knowledge into different categories (Boisot, 1987; Nonaka & Takeuchi, 1995). The second, more scientific type of knowledge management model views knowledge as an asset and connects its value to intellectual capital (Wiig, 1997). The third type views knowledge as subjective, and the model focuses on the creation of knowledge within the organization (Demarest, 1997). Which model an organization chooses depends upon the organization’s strategic needs (Aliaga, 2000). Thus, there exist many different models of knowledge management for many different needs.

Categorized Knowledge Management

One of the earliest knowledge management models, created by Boisot (1987), categorized knowledge into four basic groups, as demonstrated in Table 1 below. The first group was codified knowledge, which encompassed knowledge that could be packaged for transmission and could have two states: diffused or undiffused. Codified-diffused knowledge was considered public information. Codified-undiffused knowledge was private or proprietary information shared with only a select few. Uncodified knowledge was knowledge that is difficult to package for transmission. Uncodified-undiffused knowledge was personal knowledge. Uncodified-diffused knowledge was considered common sense. The author explained that common sense knowledge develops through the social interactions where individuals share their personal knowledge. The author also pointed out that codified and uncodified are distinct categories of knowledge.

Table 1:

Boisot’s Knowledge Category Model

             Diffused            Undiffused
Uncodified   Common Sense        Personal Knowledge
Codified     Public Knowledge    Proprietary Knowledge

 

The knowledge management model created by Nonaka and Takeuchi (1995) defined two forms of knowledge: tacit and explicit. They explained that explicit knowledge is knowledge that is shared in some way and gathered into some storage device, such as a book or computer program, which makes it easy to share with others. Tacit knowledge was explained as internal to a person, somewhat subjective, useful only in a specific context, and difficult to share, as it exists only within the mind of the individual. The authors explained that tacit knowledge could be shared through socialization: social interactions, either face to face or in a shared group experience by members of an organization. This knowledge could become explicit knowledge through externalization, when it is formalized into the information systems of the organization. Explicit knowledge can then be compiled and mixed with other existing knowledge through a process called combination. Likewise, explicit knowledge could become tacit through a process of internalization, which happens, for example, when members of the organization are trained on how to use a system. When all of these different modes of knowledge transfer work together, they create learning in what the authors call “the spiral of knowledge” (p. 165). Through iterations of learning starting at the individual and spiraling up into the group and eventually the organization, knowledge accumulates and grows, which leads to innovation and learning (Nonaka, 1991).

Table 2:

Nonaka’s Knowledge Management Model

                 To Tacit           To Explicit
From Tacit       Socialization      Externalization
From Explicit    Internalization    Combination

 

When comparing and contrasting these two categorical models, it is easy to see some similarities. The tacit and explicit categories from Nonaka (1991) are somewhat similar to the uncodified and codified knowledge categories defined by Boisot (1987). Another similarity is that the tacit and explicit, codified and uncodified knowledge categories are considered distinct by both authors. Also, both authors mentioned that their models include a sharing of knowledge, which moves knowledge from the person to the larger group. One place the two models differ greatly is that Nonaka (1991) is much more explicit about the idea of collecting knowledge and creating new knowledge through the knowledge spiral. McAdam and McCreedy (2000) criticized these models as too mechanistic and explained that they lacked a holistic view of knowledge management.

Intellectual Capital as Knowledge Management

The Skandia firm developed a scientific model of knowledge management to help measure its intellectual capital. According to Wiig (1997), this tree-like model treats knowledge as a product that can be considered an asset to the organization. The knowledge, or intellectual capital, has several categories: (1) human, (2) structural, (3) customer, (4) organizational, (5) process, (6) innovation, (7) intellectual, and (8) intangible. The value assigned to each of these categories tells the organization its future capabilities. The author defined each of the categories as follows:

  • Human capital is the level of competency for the employees.
  • Structural capital is the collection of all intellectual activities of the employees.
  • Customer capital is the value of the organization’s relationships with their customers.
  • Organizational capital is the knowledge embedded in processes.
  • Process capital is the value-creating processes.
  • Innovation capital is the explicit knowledge and inscrutable knowledge assets.
  • Intellectual property is documented and captured knowledge.
  • Intangible assets are the value of immeasurable, but important, items.

When comparing the Skandia model against the Nonaka and Takeuchi (1995) model, it is possible to see that the two models share the concept of explicit knowledge, defined as innovation capital within the Skandia model. Although the concept of tacit knowledge is not directly mentioned, Wiig (1997) explained that tacit knowledge must be transferred to explicit knowledge to be of lasting value, which compares to customer capital being transferred to innovation capital in the Skandia model. A study by Grant (1996) criticized the Nonaka and Takeuchi (1995) model because it was based in the context of new product development, whereas the Skandia model would also work for existing products.

Social Construction Model

The last model of knowledge management presented here was created by Demarest (1997) and focused on the creation of knowledge within the context of the organization. The author contends that all organizations have a knowledge economy and in general operate in about the same way. This includes an understanding that commercial knowledge is not truth; instead, it is what works for the situation to produce knowledge that leads to an economic gain. One of the primary assumptions of this model is that the knowledge creation process happens through interactions between members of the organization. The author borrowed several concepts from, and adapted, a model created by Clark and Staunton (1989) which includes the following four phases: (1) construction, (2) dissemination, (3) use, and (4) embodiment. The author defined construction as discovering or structuring some knowledge, embodiment as the process of selecting a container for the new knowledge, dissemination as the sharing of this new knowledge, and use as the creation of commercial value from the new knowledge. The author further explained that the process could flow through all four steps in sequence, or happen simultaneously along a few different paths, such as construction to use and construction to dissemination.

Figure 2

The Demarest Knowledge Management Model

demarest-km-model

The social construction model seems to be the most complete of the models presented here. When comparing and contrasting this social construction model to the categorical models of Nonaka and Takeuchi (1995) and Boisot (1987), they share (1) the concept of knowledge creation as powered by the flow of information within the organization, (2) the concept that knowledge creation happens between the members of the organization, and (3) a sharing of knowledge which moves knowledge from the person to the larger group. According to McAdam and McCreedy (2000), this model differed from the categorical and intellectual capital models because the author included the idea that knowledge is inherently connected to the social and learning processes within the organization, and “knowledge construction is not limited to scientific inputs but includes the social construction of knowledge” (p. 6). Therefore this model brings together the best parts of all the other models.

These knowledge management models span a wide range of perspectives on the definition of knowledge management. The categorical models shared the concept of tacit and explicit knowledge. The intellectual capital model considered knowledge an asset to be managed efficiently to make an organization successful. The social construction model linked knowledge to the social interactions and learning processes within the organization. The progression of models demonstrates that knowledge management continues to evolve. Grover and Davenport (2001) explained that the main purpose of knowledge management models is to help an organization grow its knowledge base and increase its competitive edge in the marketplace. Thus no one model is best for every organization; the right choice depends on how the organization perceives and defines knowledge.

The Role of Computer Technology in Knowledge Management

All the attention on knowledge management has led to increased use of Information Technology (IT) to capture knowledge. Spender and Scherer (2007) explained that “the majority of KM consultants and business people see IT as KM’s principal armamentarium-it is all about collecting, manipulating, and delivering the increasing amounts of information ITs falling costs have made available” (p. 5). This opinion seems to resonate with Zack (1999b), who proposed a knowledge management strategy for transferring tacit knowledge to a storage format, thereby making it explicit. The author explained that this conversion process is commonly called codified knowledge. He also explained that this model uses IT as a pipeline to connect people to knowledge. Hansen, Nohria, and Tierney (1999) proposed an additional knowledge management strategy that focused on dialog between individuals, thereby sharing knowledge tacit-to-tacit, which they called personalization. According to the authors, this model uses IT to connect people to people so they can exchange tacit knowledge. Thus, the use of information technology to capture knowledge varies based on the organization’s competitive strategy.

The codified knowledge strategy, according to Zack (1999b), is designed to capture knowledge, refine it into something usable, and then place it into a storage device, such as a document repository, where it can be reused by other members of the organization. The ability to store and reuse knowledge whenever needed creates an economy of reuse, which helps to prevent the constant recreation of knowledge and therefore reduces costs (Cowan & Foray, 1997). According to Hansen et al. (1999), this knowledge strategy, which they call people-to-documents, comes with a significant investment cost for IT because of the need to sort and store large amounts of knowledge, now in data form.

Hansen et al. (1999) defined the personalization knowledge strategy as drawing on the relationships established between individuals in an organization wherein they share tacit knowledge. They further explained that this strategy created an economy of experts within the organization, which they called people-to-people. In contrast they explained that this model required a much smaller investment in IT infrastructure as much less knowledge is stored in any digital format, but instead stays in the minds of the employees.

Managerial Needs

Managers within an organization with a knowledge management strategy need different types of information about the technology used for the knowledge management system. According to research by Jennex and Olfman (2008), managers required multiple measures to gauge the effectiveness and success of a knowledge management system. The managers needed to know about the quality of the information in the system, how well the users were adapting to the software, and the overall performance of the knowledge management system. Massey, Montoya-Weiss, and O’Driscoll (2002) explained that managers needed information not only about what is in the system, but also about how well the system is performing, so they could assist in removing bottlenecks. Consequently, the information needed by managers not only gauges the effectiveness of the knowledge management system, but also helps to make the system function smoothly.

Pitfalls

Using information technology to capture the knowledge of an organization might be detrimental if done improperly. In a paper by Johannessen, Olaisen, and Olsen (2001), the authors expressed much concern over the misuse of information technology to manage tacit knowledge within an organization. They argued that, despite empirical evidence to the contrary, organizations continued to invest in IT systems that may lead to a loss, or at least a diminishing, of the importance of tacit knowledge. Zack (1999b) explained that competitive performance requires a balance between tacit and explicit knowledge. Nonaka (1994) explained that knowledge within an organization is created by, and flows from, the members of an organization engaging each other and sharing tacit and explicit knowledge. Scheepers, Venkitachalam, and Gibbs (2004) extended the research of Hansen et al. (1999) and concluded that an 80/20 mix of codification and personalization strategies, chosen based on the competitive strategy of the organization, was most successful. For these reasons, a balance between tacit and explicit knowledge must be maintained in the organization’s culture and IT infrastructure.

Web 2.0

The new technologies created in the Web 2.0 culture offer some new IT solutions for knowledge management. Web 2.0 technology functions more like the way individuals interact (O’Reilly, 2006). As previously stated, the codified knowledge strategy requires a significant IT investment, not only in equipment but also in specialists to gather and organize the knowledge (Hansen et al., 1999). According to Liebowitz (1999), one of the problems with traditional knowledge management technology is that it put the user in the role of passive receiver. Tredinnick (2006), in reference to Web 2.0 technology in knowledge management, explained: “The technologies involved place a greater emphasis on the contributions of users in creating and organizing information than traditional information organization and retrieval approaches” (p. 231). Chui, Miller, and Roberts (2009) echoed this same concept when they explained that Web 2.0 technology puts the emphasis on users generating new information or editing other participants’ work. According to Levy (2009), one of the advantages of Web 2.0 technology is that as individuals share knowledge they potentially assist in the codification process. When they share their tacit knowledge by posting it in an interactive Web 2.0 tool, such as a wiki, the knowledge begins to move toward explicit as others read, enhance, and categorize it, which moves it up from personal to organizational knowledge. Accordingly, Web 2.0 technologies potentially offer many benefits, among them greater user participation, which creates more knowledge sharing and helps keep the knowledge from becoming stale, and lower costs, as participants do more of the work.

The information technology for knowledge management, and specifically for capturing organizational knowledge, depends on the organization’s competitive strategy. The two strategies outlined here, the codified and personalization knowledge strategies, use information technology in different ways: the former builds a repository requiring a significant IT investment, while the latter creates a loose network of experts requiring a smaller one. The experts warn that technology itself is not the answer: a real strategy with clear plans needs to be in place, or the technology investment will not help the knowledge management process. There are some new Web 2.0 technologies on the horizon that could positively impact user participation in knowledge management technology and save money when codifying knowledge.

Knowledge Management Is Still Evolving

The approaches to knowledge management outlined here show a progression of thought. The models progress from a specific portion of knowledge sharing and an absolute definition of knowledge as truth (Boisot, 1987; Nonaka, 1991) to a wider, more generic perspective on sharing knowledge and a more subjective definition of commercial knowledge as truth (Demarest, 1997). The information technology used to support a knowledge management strategy also continues to evolve. Both the codified knowledge strategy of Zack (1999b) and the personalization knowledge strategy of Hansen et al. (1999) require technology to capture and share knowledge. New Web 2.0 technology has the potential to change how much of an investment in technology the organization must make, as this technology is likely to increase the level of user participation, making users more active in the knowledge management process. Therefore the only certainty in knowledge management is the continued growth and change of its models, strategies, and technology.

 

References

Adomavicius, G., Bockstedt, J. C., Gupta, A., & Kauffman, R. J. (2008). Making sense of technology trends in the information technology landscape: A design science approach. MIS Quarterly, 32(4), 779-809.

 

Alavi, M., & Leidner, D. E. (2001). Review: Knowledge management and knowledge management systems: Conceptual foundations and research issues. MIS Quarterly, 25(1), 107-136.

 

Albers, J., & Brewer, S. (2003). Knowledge management and the innovation process: The eco-innovation model. Journal of Knowledge Management Practice, 4(1).

 

Aliaga, O. A. (2000). Knowledge management and strategic planning. Advances in Developing Human Resources, 2(1), 91-104. doi:10.1177/152342230000200108

 

Allix, N. (2003). Epistemology and knowledge management concepts and practices. Journal of Knowledge Management Practice, 4(1), 136-152.

 

Argyris, C., & Schön, D. (1978). Organizational learning: A theory of action perspective. Reading, MA: Addison Wesley.

 

Baird, L., Henderson, J., & Watts, S. (1997). Learning from action: An analysis of the center for army lessons learned. Human Resource Management Journal, 36(4), 385-396.

 

Boisot, M. (1987). Information and organizations: The manager as anthropologist. London, UK: Fontana/Collins.

 

Carlsson, S. (2003). Knowledge managing and knowledge management systems in inter-organizational networks. Knowledge and Process Management, 10(3), 194-206. doi:10.1002/kpm.179

 

Chui, M., Miller, A., & Roberts, R. P. (2009). Six ways to make Web 2.0 work. The McKinsey Quarterly, 1-7.

 

Clark, P., & Staunton, N. (1989). Innovation in technology and organization. London, UK: Routleedge.

 

Cowan, R., & Foray, D. (1997). The economics of codification and the diffusion of knowledge.Industrial and Corporate Change, 6(3).

 

Demarest, M. (1997). Understanding knowledge management. Long Range Planning, 30(3), 374-384. doi:10.1016/S0024-6301(97)90250-8

 

Drucker, P. F. (1988). The coming of the new organization. Harvard Business Review, 66(1), 45-53.

 

Earl, M. (2001). Knowledge management strategies: Toward a taxonomy. Journal of Management Information Systems, 18(1), 215-233.

 

Garvin, D. A. (1993). Building a learning organization. Harvard Business Review, 71(4), 78-91.

 

Grant, R. (1996). Prospering in dynamically-competitive environments: Organizational capability as knowledge integration. Organization Science, 7(4), 375-387.

 

Grover, V., & Davenport, T. H. (2001). General perspectives on knowledge management: Fostering a research agenda. Journal of Management Information Systems, 18(1), 5-21.

 

Hansen, M. T., Nohria, N., & Tierney, T. (1999). What’s your strategy for managing knowledge?Harvard Business Review, 77(2), 106-116.

 

Jennex, M. E., & Olfman, L. (2008). A model of knowledge management success. In Current Issues in Knowledge Management (pp. 34-52). Hershey, PA: Information Science Reference.

 

Johannessen, J. (2001). Mismanagement of tacit knowledge: The importance of tacit knowledge, the danger of information technology, and what to do about it. International Journal of Information Management, 21(1), 3-20. doi:10.1016/S0268-4012(00)00047-5

 

Levy, M. (2009). WEB 2.0 implications on knowledge management. Journal of Knowledge Management,13(1), 120-134. doi:10.1108/13673270910931215

 

Liebowitz, J. (1999). Key ingredients to the success of an organization’s knowledge management strategy. Knowledge and Process Management, 6(1), 37-40.

 

Massey, A. P., Montoya-Weiss, M. M., & O’Driscoll, T. M. (2002). Knowledge management in pursuit of performance: Insights from nortel networks. MIS Quarterly, 26(3), 269-289.

 

McAdam, R., & McCreedy, S. (2000). A critique of knowledge management: Using a local constructionist model. New Technology, Work & Employment, 15(2), 155.

 

Nonaka, I. (1991). The knowledge-creating company. Harvard Business Review, 85(7/8), 162-171.

 

Nonaka, I. (1994). A dynamic theory of organizational knowledge creation. Organization Science, 5(1), 14-37.

 

Nonaka, I., & Takeuchi, K. (1995). The knowledge creating company: How Japanese companies create the dynamics of innovation. Oxford, UK: Oxford University Press.

 

O’Reilly, T. (2006). Web 2.0 compact definition: Trying again. O’Reilly radar. Retrieved January 24, 2011, from http://radar.oreilly.com/2006/12/web-20-compact-definition-tryi.html

 

Porter, M. (1980). Competitive strategy: Techniques for analyzing industries and competitors. New York: Free Press.

 

Prusak, L. (2001). Where did knowledge management come from? IBM Systems Journal, 40(4), 1002-1007.

 

Scheepers, R., Venkitachalam, K., & Gibbs, M. (2004). Knowledge strategy in organizations: refining the model of Hansen, Nohria and Tierney. The Journal of Strategic Information Systems, 13(3), 201-222. doi:10.1016/j.jsis.2004.08.003

 

Senge, P. (1994). The fifth discipline: the art and practice of the learning organization (1st ed.). New York: Doubleday/Currency.

 

Serban, A. M., & Jing Luan. (2002). Overview of knowledge management. New Directions for Institutional Research, 2002(113), 5.

 

Spender, J., & Scherer, A. (2007). The philosophical foundations of knowledge management: Editors’ introduction. Organization, 14(1), 5-28.

 

Tredinnick, L. (2006). Web 2.0 and business: A pointer to the intranets of the future? Business Information Review, 23, 228-234.

 

Wiig, K. (1997). Integrating intellectual capital and knowledge management. Long Range Planning,30(3), 399-405. doi:10.1016/S0024-6301(97)90256-9

 

Zack, M. H. (1999a). Developing a knowledge strategy. California Management Review, 41(3), 125-145.

 

Zack, M. H. (1999b). Managing codified knowledge. Sloan Management Review, 40(4), 45-58.

My Favorite Yum Plugins

Yum, the Yellowdog Updater, Modified, is one of the best tools in Fedora Linux. Yum is a command-line tool that allows the user to add or remove packages (programs) from the Linux OS installation. Yum has some great plug-ins. My two favorites are:

  • Presto
  • Fastest Mirror

Presto downloads delta RPMs, grabbing only the changed portion of an updated package, which saves bandwidth and time. Fastest Mirror speeds up your downloads by finding the fastest mirror near you.

To install these options, log in as root, then type:

# yum install yum-plugin-fastestmirror

# yum install yum-presto
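
To confirm the plug-ins are active, remember that yum prints the loaded plug-ins at the top of every run, and each plug-in keeps its own config file. The paths below are the usual Fedora defaults, so they may differ slightly on your release:

# grep plugins /etc/yum.conf

Plug-ins must be globally enabled in /etc/yum.conf (plugins=1). Each plug-in also has its own file under /etc/yum/pluginconf.d/, for example:

# cat /etc/yum/pluginconf.d/fastestmirror.conf

After that, any yum run (for example # yum check-update) should start with a line like “Loaded plugins: fastestmirror, presto”.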

Ubuntu 11 and Skype

When I upgraded to Ubuntu 11 on my Dell Optiplex 755, I noticed that Skype would not work. I use Skype quite a bit, so I needed to find a solution. I have a Microsoft LifeCam VX-700, a USB camera and microphone combo. No matter what I tried, the camera would not work with Skype. Here is my solution, gleaned from dozens of other sites. I take no credit for the solution below; I merely brought it all together into one post from bits and pieces I found in different posts on the Skype and Ubuntu forums. One note: I use Ubuntu in classic mode, so my directions reflect that GUI mode.

Step 1. Open a terminal (Gnome Menu -> Accessories -> Terminal) and use gedit to create a shell script in /usr/bin. On the command line, copy and paste the following (minus the pound sign #):

#gksudo gedit /usr/bin/skype.sh

You will be prompted for your user password. Once the text editor opens, copy and paste the lines below into the editor:

#!/bin/bash
# Work around Skype display glitches by skipping ARGB visuals in Xlib
export XLIB_SKIP_ARGB_VISUALS=1
# Preload the v4l1 compatibility library so Skype can use the V4L2 webcam
LD_PRELOAD=/usr/lib/libv4l/v4l1compat.so skype

Save the file and close gedit.

I’ve seen similar solutions that suggest renaming the actual Skype binary to skype.real and then naming the shell file skype. I don’t recommend that solution, as future updates from the update manager will overwrite your script. This way, even if the update manager upgrades your Skype install, the wrapper will continue to work.

Step 2. Set permissions on the script file. In terminal chmod the file:

# sudo chmod 755 /usr/bin/skype.sh
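
As an optional sanity check (not part of the original steps), you can run the wrapper from the terminal before going any further; Skype should start without the window glitches:

/usr/bin/skype.sh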

Step 3. Modify the skype config.xml

This file is located in your home directory, in a hidden directory called .Skype. You can get there a couple of ways. For those a little less comfortable with the command line, you can use the Places menu and open your Home folder. This will open the Nautilus file manager. From the file manager menu, select View -> Show Hidden Files. Now you will see lots of folders and files that start with a dot. Look for the .Skype folder and double-click to open it. Inside that folder will be another folder with the same name as your Skype user name. Open that folder and inside you will see a file called config.xml. Right-click on that file and select “Open with Text Editor” from the menu. At the bottom of the file, just above the closing </config> tag, paste the following lines:

<Video>
  <CaptureHeight>480</CaptureHeight>
  <CaptureWidth>640</CaptureWidth>
  <RecvPolicy>callpolicy</RecvPolicy>
</Video>

Save the file and close gedit.
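
If you are comfortable in the terminal, you can skip the file manager and open the hidden file directly; replace <skypename> with the folder matching your Skype user name:

gedit ~/.Skype/<skypename>/config.xml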

Step 4. Modify the launcher.

We need to change the launcher command so that it uses the new skype.sh script we created in the /usr/bin folder. To modify it, right-click on your Gnome menu and select Edit Menus from the list. The Menu Editor program will open, and you can click on the Internet icon in the left panel. The right panel will then show all the items in your Internet menu. Click on the Skype icon and then select Properties. This will open the launcher properties window. Change the command line to the following:

/usr/bin/skype.sh

Click Close for the launcher and Close for the menu editor. Done! You should now be able to use Skype with your camera. Hope this helps you!
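
As an aside, if you don’t have the Gnome menu editor handy, the same change can usually be made by editing the launcher’s .desktop file directly. The system-wide path below is typical for Ubuntu but may vary on your install:

sudo gedit /usr/share/applications/skype.desktop

Then change the Exec= line so it reads Exec=/usr/bin/skype.sh.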

How To Implement a Knowledge Management System

How To Implement a Knowledge Management System:

A Practical Guide for Project Managers

By Russ Wright

Implementing a knowledge management system can greatly assist a project manager in their work. A study conducted by White and Fortune (2002) described the most important project success factors mentioned by project managers, which included: (1) having clear goals and objectives, (2) good support from senior management, and (3) enough funding and resources to complete the tasks. In a paper on the benefits of knowledge management systems, Wiig (1997) explained that knowledge management systems, when properly implemented, could improve communication between departments and provide the users with a history of best practices within the organization. In a study conducted by Alavi and Leidner (1999), the authors described that an effective knowledge management system assisted project management by providing better communication, shorter times to find solutions to problems, and better estimates of project duration. Thus, a knowledge management system can help a project manager in all three of the important areas by providing the information needed to secure the project success factors. This paper will provide a background on the definition of knowledge and knowledge management models. Three knowledge management implementation models are then reviewed to demonstrate a progression of the research. Further, from a synthesis of the literature on knowledge management implementation, this document provides several factors that would help a project manager be successful at implementing a knowledge management system. The conclusion finds that the field of knowledge management and the process of implementation are still evolving.

Background

A Brief Discussion of Knowledge

There is much debate among scholars as to what constitutes knowledge within the context of knowledge management systems. Nonaka and Takeuchi (1995) defined knowledge as beliefs and commitments, not merely information. Drawing from Polanyi (1966), they used the concepts of tacit and explicit knowledge. Tacit knowledge was defined as personal, context-specific, and difficult to explain; this knowledge, gained from experience, can be lost if an individual leaves an organization without sharing it. Explicit knowledge was defined as the common knowledge known by a large group that could be easily codified and shared. Zack (1999) defined codified knowledge as knowledge that is created, located, captured and shared, and that can be used to solve problems and create opportunities. Accordingly, this type of knowledge, because it is captured, can become stale if not regularly revisited and evaluated (Gillingham & Roberts, 2006). Thus, knowledge, for the purpose of this paper, will follow the aforementioned two categories of tacit and explicit.

Knowledge Management

The knowledge management field is still fairly new, having emerged within the past three decades, and many facets of the field are still unsettled. According to research by Rubenstein-Montano, Liebowitz, Buchwalter, McCaw, Newman & Rebeck (2001a), one of the bigger revelations of the past decade was the realization that knowledge management was far more than technology for sharing knowledge, as it also incorporated individuals and the culture in which they worked. According to research by Bresnen, Edelman, Newell, Scarbrough and Swan (2003), sharing knowledge within and across projects was very difficult, and developing the ability to do so was a very important source of competitive advantage for an organization. Thus the project manager, who wants to improve the quality of knowledge sharing through the implementation of a knowledge management system, needs to consider many factors to find a solution. For the project manager, finding a useful methodology and implementing it requires a good understanding not only of the methodologies, but also of the technological constraints of the organization in which they wish to deploy a knowledge management system. Research conducted by Liebowitz and Megbolugbe (2003) identified several high-tech and low-tech solutions for knowledge management. Low-cost solutions included frequent face-to-face meetings between departments, perhaps over working lunches, to share tacit knowledge. If an organization required a low-tech virtual solution because it was spread out over a large distance, which made meeting in person difficult or impossible, it might have used online bulletin boards and Facebook-like groups to share tacit knowledge in a virtual workspace. Research by Kasvi, Vartiainen and Hailikari (2003) showed that these types of interactions, lunch meetings between departments and seminars, were described as some of the most important sources of knowledge. The more high-tech solutions explained by Liebowitz and Megbolugbe (2003) used expert systems to capture and codify knowledge into a repository, and data and text mining software that looked for patterns to inductively create knowledge. These solutions were much more difficult to implement and required considerable IT investment and employee training. Kuhn and Abecker (1997) acknowledged the value of these systems and cautioned that a balanced approach that flexes with the organization was required to make them function well. Thus, the project manager must soberly consider which models for knowledge management will fit an organization’s capability and budget before attempting to find and implement a particular model.

Knowledge Management Models

The knowledge management models presented below are only a sample of the many models found in the research literature. They are representative of the others, as many share similar features and processes. The three presented below attempt to show the progression of the research as the models take on more complexity while, at the same time, trying to explain and simplify the implementation process. The model presented by Wiig (1997) had four basic iterative steps: (1) review, (2) conceptualize, (3) reflect, and (4) act, as depicted in figure 1 below. The review step called for monitoring the internal performance of the organization against other organizations in the same industry to determine how well it is doing. The conceptualize step began by organizing the knowledge into different levels. The author provided several examples of survey instruments that identified the knowledge assets and, in turn, associated them with the particular business process that used them. Also at this step, strengths and weaknesses in the knowledge inventory were identified. The reflect step involved creating plans to improve on the strengths and weaknesses previously discovered. Finally, the act step was the implementation of the plan, which might be carried out by individuals in different parts of the organization. This process would be repeated to assist in the capture of knowledge.

Figure 1 Wiig’s knowledge management model


wig-km-model

A much more sophisticated model was presented by Rubenstein-Montano, Liebowitz, Buchwalter, McCaw, Newman and Rebeck (2001b), which, according to the authors, addressed several of the shortcomings of the other models. The authors argued that the existing models lacked detail, did not include an overarching framework, and failed to address the entire knowledge management process. The model presented by the authors consisted of five phases: (1) strategize, (2) model, (3) act, (4) revise and (5) transfer, as depicted in figure 2 below. Each phase of the model could loop back to the previous one if it was determined that further work within a particular phase was required. The strategize phase covered strategic planning, business needs analysis and a cultural assessment of the organization. The model phase involved conceptual planning that covered knowledge audits as well as the planning and design of the approach to store and distribute the knowledge. The act phase focused on capturing, organizing, creating and sharing the knowledge. The revise phase consisted of implementing the system, reviewing the knowledge, and evaluating the achieved results. The transfer phase published the knowledge so it could be used to create value for the organization, and considered expansion of the knowledge base.

Figure 2 Rubenstein-Montano et al. Model

rubenstein-montano-km-model

A later model presented by Chalmeta and Grangel (2008) tried to simplify the existing systems and provided a generic knowledge management implementation model. The authors argued that all knowledge management systems used some sort of computer system and therefore the implementation methodology should reflect the need for it. This model also consisted of five phases: (1) identification, (2) extraction, (3) representation, (4) processing and (5) utilization, as depicted in figure 3 below. The identification phase focused on identifying the knowledge to be stored and classifying it into categories. The extraction phase involved transforming the knowledge from its existing state into the format used in the knowledge management system. The representation phase created a model or diagram that mapped the knowledge in the system. The processing phase defined the technology platform used to display and share the knowledge. The utilization phase involved deploying the knowledge portal and training the members of the organization to use the system.

Figure 3 The Chalmeta and Grangel model

chalmeta-grangel-km-model

The models presented here are a sampling from the literature. The Wiig (1997) model for constructing a knowledge management system seems very simple. Yet Diakoulakis et al. (2004) explained that this simplicity is deceptive because the model can “build, transform, organize, deploy and use knowledge” (p. 37). The Rubenstein-Montano et al. (2001b) model tried to fix the shortcomings of the models that came before it. In an attempt to generalize and simplify a knowledge management implementation model, Chalmeta and Grangel (2008) created another model, which included elements from both of the aforementioned models and attempted to provide a much more generic and complete framework for the implementation of a knowledge management system. Regardless of which model is chosen to implement a knowledge management system within an organization, there are many factors that contribute to the success of the project.

Factors for Success

For the project manager, there are many factors to consider when deciding to implement a knowledge management system. Below is a synthesis of the factors from existing research that affect the ability of an organization to successfully implement a knowledge management system. They are: (1) managerial support, (2) a supportive culture, (3) incentives for motivation, (4) technology that matches the strategy, (5) ways to assess the value of the process, (6) specialists and processes, and (7) training. Each of these factors for success is discussed in detail below.

The Support of Management

Without the support of management, the implementation of a knowledge management system will not work. According to a study conducted by Holsapple and Joshi (2000), a major factor that contributed to the successful implementation of a knowledge management system was the behavior of the management team, who provided the impetus and the model of behavior that demonstrated a desire to use a knowledge management system. Another study by Massey, Montoya-Weiss, and O’Driscoll (2002), who studied the implementation of a knowledge management system at Nortel Networks, explained that the managerial leadership provided control and coordination and, most importantly, ensured that the knowledge management strategy was aligned with the business strategy. A similar study conducted by Sharp (2003) explained that the way employees acted during the implementation of the knowledge management system was a direct reflection of the behavior of management. Therefore, the support of management not only provides the push to make it happen, it also requires them to set the tone, which helps define the culture and acceptance of a knowledge management system.

The Proper Culture of Collaboration

The culture created by management will greatly influence the success of a knowledge management system implementation. In a paper by Anklam (2002), the author explained that knowledge management and creation require collaboration on a much greater level. Individuals within the organization must develop a sense of trust between them that facilitates the sharing of knowledge. According to research by Ruggles (1999), knowledge management without a culture of collaboration will not succeed, as collaboration is “strongly conducive to knowledge generation and transfer” (p. 300). Gold, Malhotra and Segars (2001) explained that collaboration is important for the transfer of tacit knowledge between individuals within an organization. The research conducted by Chourides, Longbottom and Murphy (2003) found that a coaching leadership style that established a learning culture was among the most significant factors for a successful knowledge management system implementation. Thus, the vision of management, which includes a vision of an organizational culture of collaboration, is required for the implementation of a knowledge management system to succeed.

Incentives for Motivation

Also included within the culture of an organization is motivation for individuals in the form of incentives. According to research by Yahya and Goh (2002), connecting rewards and compensation to an individual’s performance appraisals can have a positive impact on the individual’s motivation to use a knowledge management system. Huber (2001) explained that to motivate individuals to share knowledge, the policies of the organization regarding rewards must promote sharing. He further explained that the organization should publicize and celebrate instances of knowledge sharing that benefited the organization. The research conducted by Darroch (2005) seems to validate these earlier works, as the author explained that his findings showed that the knowledge-sharing culture of an organization was directly affected by performance incentives. Therefore, if management offers incentives, the workers within the organization will be motivated to share knowledge.

Technology That Matches the Strategy

Information and communication technology, when matched to the business strategy for knowledge management, plays an integral role in a successful implementation of a knowledge management system. The two major strategies for knowledge management are classified as codification and personalization. According to Zack (1999), codification is a process whereby tacit knowledge is captured in some electronic form and then shared around the organization, thus making it explicit. He further explained that in this model, information technology is used like a pipeline to move knowledge around the organization. Because this model uses extensive technology and knowledge specialists to capture and store the knowledge, the monetary investment is very high. The second strategy for knowledge management, personalization, according to Hansen, Nohria and Tierney (1999), used information and communication technology to facilitate person-to-person conversation in which the participants transferred tacit knowledge. This model used much less technology and therefore the costs were much lower. It is important to note here that many scholars (Alavi & Leidner, 1999; Borghoff & Pareschi, 1997; Wong, 2005) stated that information and communication technology should not be considered an end unto itself but only a tool, as the wrong attitude towards technology can cause the entire knowledge management process to stagnate. Thus, matching the knowledge management strategy to the information technology budget of the organization will have a significant impact on the successful implementation of a knowledge management system.

Assigning Value to the Process

Once a knowledge management system is in place, it will be important to express to management how well the system enhances the business strategy. This can be difficult, as many of the benefits created by a knowledge management system are intangible, such as the good will and customer loyalty generated by the extra attention, and are very difficult to measure (Snowden, 2002). In research conducted by Park, Ribiere and Schulte (2004), management only considered the implementation successful when there was some concrete way of measuring its positive impact. This same attitude is echoed in the research conducted by Bose (2004), who explained that the ability to measure the value of a knowledge management system is critical to sustaining management’s support. He further explained that only with some way to measure the results could management assist in solving problems in the system. Jennex and Olfman (2008) further added that a successful implementation of a knowledge management system requires the ability to measure several factors of success, among them: (1) information quality, (2) user satisfaction, and (3) system quality. They further explained that each of these factors adds to the measurement of the benefits of implementing the system. Therefore, defining a method to measure the success of the knowledge management system implementation, although somewhat difficult, not only informs management but also helps the project manager garner their continued support and sustain use of the system.

People and Processes

Special roles are needed to maintain the knowledge management system. According to Zack (1999), there were specific roles required to maintain the knowledge management system within an organization, which included people to gather, refine and distribute the explicit knowledge throughout the organization, and IT support for the technology that held the repository. Grover and Davenport (2001) took this notion further and suggested the role of a chief knowledge officer, which fulfilled many purposes, including serving as an indicator that an organization was serious about knowledge management. This role also served as the chief designer of the knowledge architecture. Coombs and Hull (1998) explained that there must also be many knowledge managers within an organization who were familiar with knowledge management and facilitated the sharing of knowledge among different departments. Therefore, these roles and responsibilities help to maintain the system and show the support of the executive management.

Training

Individuals within an organization need to be trained not only on the technology used to share knowledge, but also to raise their awareness of how to manage knowledge and see it as a valuable resource for the organization. Because the knowledge existed within the minds of individuals within the organization, without proper training an employee was not motivated to use a knowledge management system and share their knowledge (Bhatt, 2001). Research conducted by Hung, Huang, Lin and Tsai (2005) into critical factors for the adoption of a knowledge management system found that one of the biggest factors for successful implementation, and for increasing an organization’s competitiveness, was effectively training the employees to recognize the importance of the knowledge management system. Another important reason for training employees was to give them a common language and perception of how they thought about and defined knowledge (Liebowitz, 1999). Therefore, training is a key success factor not only because employees need to know how to use the knowledge management system, but also because it teaches the individual to recognize knowledge and understand the value it represents to the organization.

The seven factors for successful implementation of a knowledge management system outlined here give the project manager a starting point for assessing the readiness of a particular organization. The project manager must consider how much support management, and especially senior management, will give to the project. Another aspect requiring consideration is the culture of the organization. The project manager will have to reflect upon the culture of the organization and note whether management is promoting a culture conducive to the plan. The culture created by management will also need to provide incentives to help foster sharing of knowledge among the members. A big consideration the project manager will have to undertake is the availability of technology, and the people to support it. Some plans can be very expensive, and a good review of the organization’s technological infrastructure is needed before a serious plan can be made. Also of great importance is the training of the individuals who will use the system. Not only will they need to know how to use the system, but also how to recognize when something is knowledge worth storing.

Knowledge Management Implementation Is Still Evolving

It is clear from this research that a project manager who wants to improve the sharing of knowledge both within and across projects can benefit from a knowledge management system. From the progression of knowledge management models demonstrated above, it is clear that researchers’ understanding of how to implement a knowledge management system is still evolving. The research presented on the factors for success demonstrates that work is still ongoing to understand the critical success factors for knowledge management system implementation. A core set of knowledge on how to successfully implement a knowledge management system seems to exist, yet the constant evolution of technology continues to change how a system might be implemented.

References

Alavi, M., & Leidner, D. (1999). Knowledge management systems: Emerging views and practices from the field. In Hawaii International Conference on System Sciences (p. 7009). Published by the IEEE Computer Society.

Anklam, P. (2002). Knowledge management: the collaboration thread. Bulletin of the American Society for Information Science and Technology, 28(6), 8-11.

Bhatt, G. (2001). Knowledge management in organizations: examining the interaction between technologies, techniques, and people. Journal of Knowledge Management, 5(1), 68-75.

Borghoff, U. M., & Pareschi, R. (1997). Information technology for knowledge management. Journal of Universal Computer Science, 3(8), 835-842.

Bose, R. (2004). Knowledge management metrics. Industrial Management & Data Systems, 104(6), 457-468.

Bresnen, M., Edelman, L., Newell, S., Scarbrough, H., & Swan, J. (2003). Social practices and the management of knowledge in project environments. International Journal of Project Management, 21(3), 157-166. doi:10.1016/S0263-7863(02)00090-X

Chalmeta, R., & Grangel, R. (2008). Methodology for the implementation of knowledge management systems. Journal of the American Society for Information Science & Technology, 59(5), 742-755.

Chourides, P., Longbottom, D., & Murphy, W. (2003). Excellence in knowledge management: an empirical study to identify critical factors and performance measures. Measuring Business Excellence, 7(2), 29-45.

Coombs, R., & Hull, R. (1998). ‘Knowledge management practices’ and path-dependency in innovation. Research Policy, 27(3), 239-256.

Darroch, J. (2005). Knowledge management, innovation and firm performance. Journal of Knowledge Management, 9(3), 101-115.

Diakoulakis, I. E., Georgopoulos, N. B., Koulouriotis, D. E., & Emiris, D. M. (2004). Towards a holistic knowledge management model. Journal of knowledge management, 8(1), 32-46.

Gillingham, H., & Roberts, B. (2006). Implementing knowledge management: A practical approach. Journal of Knowledge Management Practice, 7(1).

Gold, A. H., Malhotra, A., & Segars, A. H. (2001). Knowledge management: An organizational capabilities perspective. Journal of Management Information Systems, 18(1), 185-214.

Grover, V., & Davenport, T. H. (2001). General perspectives on knowledge management: Fostering a research agenda. Journal of Management Information Systems, 18(1), 5-21.

Hansen, M. T., Nohria, N., & Tierney, T. (1999). What’s your strategy for managing knowledge? Harvard Business Review, 77(2), 106-116.

Holsapple, C. W., & Joshi, K. D. (2000). An investigation of factors that influence the management of knowledge in organizations. The Journal of Strategic Information Systems, 9(2-3), 235-261. doi:10.1016/S0963-8687(00)00046-9

Huber, G. P. (2001). Transfer of knowledge in knowledge management systems: unexplored issues and suggested studies. European Journal of Information Systems, 10(2), 72-79.

Hung, Y. C., Huang, S. M., Lin, Q. P., & Tsai, M. L. (2005). Critical factors in adopting a knowledge management system for the pharmaceutical industry. Industrial Management & Data Systems, 105(2), 164-183.

Jennex, M. E., & Olfman, L. (2008). A model of knowledge management success. In Current Issues in Knowledge Management (pp. 34-52). Hershey, PA: Information Science Reference.

Kasvi, J. J. J., Vartiainen, M., & Hailikari, M. (2003). Managing knowledge and knowledge competences in projects and project organisations. International Journal of Project Management, 21(8), 571-582. doi:10.1016/S0263-7863(02)00057-1

Kuhn, O., & Abecker, A. (1997). Corporate memories for knowledge management in industrial practice: Prospects and challenges. Journal of Universal Computer Science, 3(8), 929-954.

Liebowitz, J. (1999). Key ingredients to the success of an organization’s knowledge management strategy. Knowledge and Process Management, 6(1), 37-40.

Liebowitz, J., & Megbolugbe, I. (2003). A set of frameworks to aid the project manager in conceptualizing and implementing knowledge management initiatives. International Journal of Project Management, 21(3), 189-198. doi:10.1016/S0263-7863(02)00093-5

Massey, A. P., Montoya-Weiss, M. M., & O’Driscoll, T. M. (2002). Knowledge management in pursuit of performance: Insights from Nortel Networks. MIS Quarterly, 26(3), 269-289.

Nonaka, I., & Takeuchi, H. (1995). The knowledge-creating company: How Japanese companies create the dynamics of innovation. Oxford, UK: Oxford University Press.

Park, H., Ribiere, V., & Schulte, W. (2004). Critical attributes of organizational culture that promote knowledge management technology implementation success. Journal of Knowledge Management, 8(3), 106.

Polanyi, M. (1966). The tacit dimension. London: Routledge and Kegan Paul.

Rubenstein-Montano, B., Liebowitz, J., Buchwalter, J., McCaw, D., Newman, B., & Rebeck, K. (2001a). A systems thinking framework for knowledge management. Decision Support Systems, 31(1), 5-16. doi:10.1016/S0167-9236(00)00116-0

Rubenstein-Montano, B., Liebowitz, J., Buchwalter, J., McCaw, D., Newman, B., & Rebeck, K. (2001b). SMARTVision: a knowledge-management methodology. Journal of Knowledge Management,5(4), 300-310.

Ruggles, R. (1999). The state of the notion: knowledge management in practice. The Knowledge Management Yearbook 1999-2000, 295.

Sharp, D. (2003). Knowledge management today: Challenges and opportunities. Information Systems Management, 20(2), 32.

Snowden, D. (2002). Complex acts of knowing: Paradox and descriptive self-awareness. Journal of Knowledge Management, 6(2), 100-111.

White, D., & Fortune, J. (2002). Current practice in project management: An empirical study. International Journal of Project Management, 20(1), 1-11. doi:10.1016/S0263-7863(00)00029-6

Wiig, K. M. (1997). Knowledge management: Where did it come from and where will it go? Expert Systems with Applications, 13(1), 1-14. doi:10.1016/S0957-4174(97)00018-3

Wiig, K. M., De Hoog, R., & Van Der Spek, R. (1997). Supporting knowledge management: A selection of methods and techniques. Expert Systems with Applications, 13(1), 15-28.

Wong, K. Y. (2005). Critical success factors for implementing knowledge management in small and medium enterprises. Industrial Management And Data Systems, 105(3/4), 261.

Yahya, S., & Goh, W. K. (2002). Managing human resources toward achieving knowledge management. Journal of Knowledge Management, 6(5), 457-468.

Zack, M. H. (1999). Managing codified knowledge. Sloan Management Review, 40(4), 45-58.

Towards Understanding Deploying Open Source Software

Towards Understanding Deploying Open Source Software:

A Study of Factors That Can Impact the Economics of an Organization.

Abstract

The purpose of this paper is to present the factors that can impact the decision to deploy Open Source Software (OSS) in an organization. Many organizations fail to understand the economics involved in using open-source software and consequently suffer poor results. An explanation of the Open Source Initiative (OSI) and a definition of Free/Libre Open Source Software (F/LOSS) are provided, along with a discussion of the benefits and pitfalls of deployment in the context of the value chain. The results conclude that the benefits outweigh the risks and that profit/benefit is possible if the economic impact is understood.

Towards Understanding Deploying Open Source Software

Without the technology that runs Information Systems (IS), most organizations would cease to function. The business model of a typical organization is tied to the systems and technology that it uses. Since the advent of e-commerce, with the market for products becoming nearly global, organizations are constantly looking for innovations in technology that will give them a competitive edge. One study pointed out that any such advantage is temporary, as competing companies will copy it or innovate even newer and cheaper technology, which creates a perpetual requirement to adapt business processes (Ward and Peppard, 2004). Open Source Software (OSS), if properly implemented, can become a key part of the innovation and adaptation of business organizations, helping them to maintain a competitive edge.

Background

Ever since Porter (1996) introduced the Value Chain Analysis business concept in Harvard Business Review, consultants have tried to use the resulting methodology to evaluate the quality of each link in the chain. A value chain is a series of tasks and interrelated activities performed by an organization to produce a product. As the product passes through each activity, or “link,” in the chain, it is refined in some way and takes on more value. The sum of the value created by all of the refinement activities is greater than the value created at any single step in the chain. One way an organization might improve the quality of the links is to use OSS in its Information Systems (IS).

There are many advantages to using OSS to improve the value of each link and the relationships between them. How IS is used by a company has a significant influence on the relationships between the activities in a value chain (Porter and Millar, 1985). IS helps a company to create and maintain competitiveness because competitiveness flows from creating value for the customer. The activities that create value for a company, such as purchasing, production and sales, are not independent but rely on each other in the value chain. Porter and Millar (1985) concluded that the proper use of information technology minimizes costs while maximizing value, optimizing value activities, and guaranteeing competitive advantages (p. 151). These relationships between activities can be strengthened by good use of IS, and quality OSS can improve competitiveness and create greater value.

The Open Source Initiative

According to the website About the Open Source Initiative (n.d.), the Open Source Initiative (OSI), incorporated in 1998, is a non-profit corporation whose primary purpose is to provide education about Open Source Software (OSS) and to advocate for its benefits. The OSI also claims to build bridges among the constituents of the open-source community. It further claims that one of its most important activities is to act as a standards body that maintains the definition of OSS. The OSI also holds a trademark, the Open Source Initiative Approved License, around which it attempts to build a nexus of trust so that all parties involved can cooperate on the use of OSS.

Definition of F/LOSS

The definition of free/libre open source software (F/LOSS) is often misunderstood. The “free” part of the definition is about the liberty to use the product, not about the price. The Free Software Foundation (2010) maintains the following definition:

“Free software” is a matter of liberty, not price. To understand the concept, you should think of “free” as in “free speech,” not as in “free beer.” Free software is a matter of the users’ freedom to run, copy, distribute, study, change and improve the software. More precisely, it means that the program’s users have the four essential freedoms:

  • The freedom to run the program, for any purpose (freedom 0).
  • The freedom to study how the program works, and change it to make it do what you wish (freedom 1). Access to the source code is a precondition for this.
  • The freedom to redistribute copies so you can help your neighbor (freedom 2).
  • The freedom to distribute copies of your modified versions to others (freedom 3). By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this.

Embracing the Business Model

The decision to deploy OSS in an organization is difficult at best. The idea of embracing the Open Source Initiative (OSI) business model as a business strategy is even more frightening for organization executives. There are dissenting opinions from many directions on the value of using OSS that attempt to sway the decision for many reasons: some are based on misinformation, others on fallacious economic policies, and others on fear. Behlendorf (1999) explained that the OSI model is not for everyone; it is often implemented incorrectly and then blamed for the failure. In fact, the failure is more often a result of a poor understanding of how to embrace the model and deploy OSS than of the OSS itself.

All or Nothing?

There are multiple ways to deploy OSS in an organization; it is not an all-or-nothing approach. For example, a school might decide to deploy an open-source office suite, such as OpenOffice, in place of the commonly used Microsoft Office. Another example might be a commercial company deciding to use a module of OSS code to provide functionality in a product or as a value add-on to an existing proprietary product; the company ObjectWeb, for instance, had several products that embedded OSS components. In the most extreme example, a company might decide to open the source code of one of its products to try to create a competitive advantage and gain market share. One of the best-known examples of this move is the Netscape Corporation, which opened the source of its web browser as Mozilla. Each of these methods of deploying OSS shares some common economic impacts that need to be considered before making the move to OSS.

The implementation of OSS, if properly understood, can be a reliable asset in the value chain. One of the biggest hurdles to overcome is the idea that the source code, once kept secret and valued as the company’s profit maker, is now opened and shared, even with competitors, to produce a better product. One study showed that if a company decides to open the source code of a product, it will often see greater profit if it has a complementary product that the OSS application enhances, because this allows the company to exploit the benefits of combining the open and closed products (Haruvy, Sethi and Zhou, 2008). In a related study, the authors explained that not all code has to be shared; those pieces that differentiate the organization and make it competitive can be kept and sold separately (Behlendorf, 1999). Allowing the core code to be open and shared instead permits innovation that can strengthen and enhance the product and provide new paths for innovation.

The Symbiosis

There is a symbiotic relationship between an organization that chooses to use OSS and the OSI community. When the two work together well, success is likely. One study identified four characteristics that defined the most successful projects: (1) a solid core developer group, (2) an active peripheral group in communication, (3) a high level of communication exemplified by a depth of threaded communications, and (4) only moderate dependence on the internal community for beta testing. The research found a direct correlation between, and predictability in, the level of communication and the code development within a project (Vir Singh, Fan, and Tan, 2007). This means that if these qualities are shared by an organization choosing to deploy and use OSS, there is a greater chance of success. There are many examples of companies that have successfully opened their source code and created a symbiotic relationship with the OSI community.

The Mozilla Example

Organizations, even commercial firms, can benefit from opening their source code. One possible benefit is gaining ground against a competitor. In a research study, Lerner and Tirole (2005) explained how Netscape in 1998 decided to open the source code of a portion of its browser as “Mozilla”. At that time Internet Explorer was dominating the browser market and Netscape was holding only a tiny share. The study further showed how the web browser application became more accepted and how, by opening this source code, Netscape experienced an increase in market share and profit. This means it is possible for an organization to remain commercial, open its source code, and make a profit. When the point of the OSI is understood, there is great value in the process.

The Value In Using F/LOSS

About ten years ago, OSS was relegated to the roles of operating system (Linux) and web server (Apache). Now OSS is firmly in the middle layer, providing databases (MySQL, PostgreSQL) and email servers (Qmail, Horde) among many other functions. OSS has two very distinct properties: the source code is accessible, and the developer has the liberty to modify the code any way they want and redistribute pieces of it or the entire source code (Thomas and Hunt, 2004). This is important, particularly to the OSI community, because access to the source, and the liberty to reuse it, creates the opportunity to use existing software as a launching point for new products based on new designs. This gives the developer access to high-quality source code readily available with a few mouse clicks in a web browser. The liberty to use the code as desired and the availability of the source code remove several barriers that exist when attempting to reuse code. A developer may choose to use a few lines of code, an entire class or an entire system (Frakes and Terry, 1996). This means the reuse of the source code has several advantages, including the ability to break out of lock-in with a specific vendor, produce high-quality products, get the product to market quicker, and foster innovation, which adds value and increases customer satisfaction.

OSS is developed by many groups, often spread out globally. Most often, the initial creation of the software is done to fulfill a need of a single group of developers at one company or school. This group will release the code to the public before it is completed, usually to help spur further development. Perens (2005) explained that other companies may pick up this code and extend or adapt it to their use. This means new features and functionality can be added or adapted as needed by the organization that chooses to use OSS. The code is often of very high quality, and developers are offered incentives to write high-quality code.

The Quality of the Code

One common objection to using OSS as part of the IS in an organization’s strategic portfolio is the quality of the source code. Quality, in this instance, is defined as well-designed, well-written, relatively error-free and functional. Two major influences drive the level of quality in an open-source project. Gacek and Arief (2004) explained that the structure of an OSI development community is based most closely on a meritocracy: the better the quality of the code written, the more merit the developer has within the community. Perens (2005) explained that the developers who modify OSS code have an incentive to write the code well so that their changes get incorporated into the main body of the code; that way they don’t have to spend money re-integrating their changes into the existing code base every time they want to implement an update. The value of a meritocracy is that a developer has an incentive to write quality code and to become a respected member of the community. These incentives help to promote high-quality, well-written code in OSS projects.

Some argue that the quality of OSS code is so good that it is unfair to compare it to proprietary or closed-source projects. McConnell (1999) stated that successful OSI projects should not be compared to ordinary closed-source projects, but instead to the software development effectiveness achieved by leading-edge companies that use multiple practices to produce high-quality software. This means the quality should be compared based on the methodologies used by the companies, to determine whether, within the methodology, the quality of the code is equal. Golden (2008) cited a case study by Coverity, a company that tests software quality, which found that OSS programs averaged half the number of bugs per thousand lines of code compared to proprietary programs (p. 36). This suggests the quality of OSS code, when compared to equivalent proprietary programs, is superior.

Breaking Vendor Lock-In

One aspect of OSS that can present an opportunity to create a competitive advantage is the ability to modify the source to meet the organization’s business model. One study showed that a major strategy used by software vendors is to keep the cost of switching to another product high by implementing proprietary data formats and keeping tight control over access to the source code (Carillo and Okoli, 2008). When an expensive proprietary product is purchased for use in a company, Castelluccio (2008) explained, the product often becomes “sticky” or “locked-in,” either because management wants to earn a positive return on the investment or because the cost of switching would be excessively high. For example, the data storage format could lack interoperability with other vendors’ software, preventing the import and export of information. These concerns can also be accompanied by an inability to fix bugs in a timely manner or to get the vendor to respond quickly to the organization’s needs. Golden (2008) showed that one of the primary reasons organizations are adopting OSS is that, because it includes the source code, the organization is free to strip out unneeded functions and make repairs itself, thus eliminating dependence on the vendor. The study further showed that because no single vendor controls OSS, the organization is free to pick a different provider to support its program. For the organization this means the liberty to make the changes it needs and customize the software to fit its business model, which in turn will help increase the value of individual links within the value chain and strengthen a competitive advantage.

Quicker To Market

Using OSS can provide a competitive advantage, especially when it is used in the development of new applications. According to a research study by Ajila and Wu (2007), there is a strong correlation between the adoption of OSS and the ability to get a product to market sooner, providing economic gains in both productivity and product quality. As long as the source code meets the needs of the project, incorporating it will shorten the development time. That means that by adding OSS to a project, the development time can be shortened, the cost to develop decreased, and the time to market reduced. These factors can give an organization an advantage and the ability to strengthen the bonds in the value chain and thus add value to the product.

Innovation

Innovation is truly a key part of the OSI movement. The ability to reuse existing code and create something new is vital to the success of an organization. Vujovic and Ulhøi (2008) explained that the model known as open innovation, now seen in a global market, allows for greater innovation in product research and development. Companies’ ability to stay competitive is no longer exclusively determined by efficient cost management and marketing capabilities. Rather, it relies increasingly on the continuous development of new and superior products and services in a business environment characterized by growing instability with regard to consumer preferences and technology development. One study found that within a three-year period, a small core of developers, assisted by a transitory group of less committed developers, was able to create several new products using existing OSS from SourceForge; the ability to use the code gave them a measurable advantage in creating applications (David and Rullani, 2008). This means the competitive edge has to come from a wider body of developers and researchers, and with such strong competition in a global market, using the OSI model, where innovation happens across a wide community, will help an organization stay competitive.

The Pitfalls

There are many ways to use OSS improperly. When deploying OSS to create or increase a competitive advantage, there are just as many ways to implement the project improperly as there are to succeed. Before an organization decides to incorporate a single line of OSS code into a project, or to open its source to the world, there are some concerns that bear sober review.

No Reliable Release Schedule

The OSI community does not employ the same controls over release schedules as a commercial vendor. According to one study, the level of control exerted on an OSS development project is much lower than on a commercial counterpart, and OSS projects thus have less reliable release schedules; only the most active and highly used projects, such as Apache, have reliable release schedules (Capra, Francalanci, and Merlo, 2006). This means release dates are not exact, and a project may not deliver as planned. If an organization is counting on the deployment of a certain OSS project, there is a risk that the project might not deliver what is planned or promised, since OSS projects do not exert the same type of control over release dates.

The Developer Skill Set

The developers who will implement OSS in an organization do not need special skills, but they do need an understanding of OSS and of how to reuse source code properly. One study showed that while there is no strong statistical significance between OSS reuse skill and experience and software development economics, there is some statistical correlation pointing to the fact that software reuse experience and skill in general are important when reusing OSS (Ajila and Wu, 2007). That means it is important that the developers engaged in the process have at least a general understanding of code reuse and some familiarity with OSS before attempting an integration; otherwise, failure and cost overruns from a lack of preparation are likely.

Ideology Over Pragmatism

Although OSS offers many benefits, it does not fit all situations, and a clear pragmatic perspective is required because relying on ideology alone can mean missing out on better solutions. OSS often makes its way into a company through the developers who work on projects. Their mindset toward OSS can be ideological and prevent them from considering alternatives. According to a study by Ven and Verelst (2008), if a developer's mindset toward OSS is ideological, their decision will not be pragmatic; instead, they will have a strong preference for using OSS without properly considering proprietary alternatives. The suitability of the OSS solution in the organization-specific context is sometimes overlooked, and new innovations might be ignored. This means the developers, who are often the decision makers about which products to use, might adopt an OSS solution that does not fit as well because they are averse to considering proprietary solutions or do not properly weigh all the alternatives to find the best fit.

Maintenance

Another pitfall in deploying OSS within an organization is failure to create a symbiotic relationship with the OSI community. An organization could choose to download, modify, and deploy OSS solutions within its value chain to gain a competitive advantage. According to Dahlander (2004), some organizations stop at this point and, for a time, enjoy the benefit of using high-quality code for free in their primary or secondary value chain activities. This, however, is a short-sighted plan, and only a short-term gain is realized: costs remain, and problems arise when updates and upgrades are needed. The organization may go back to the project where it acquired the source code only to find the project discontinued or morphed into something far from meeting its needs. The modifications made to the original OSS code should be returned to the community so that they are incorporated into the code base, thus helping to maintain the symbiosis. The maintenance of the OSS code is as important as the value gained from the low price. This means an organization should take the time to build the relationship with the OSI community and not focus solely on the short-term gain realized by inserting free source code into its value chain.

Legal Issues

One important part of using OSS in a product that is distributed outside the organization is preserving the intellectual property rights of the people who worked hard to create the software. When reusing OSS or incorporating the code into a product that is distributed outside the organization, credit must be given to the original developer and the source code must be made available, or the organization will be in violation of the license that accompanies OSS. According to research performed by Walsh and Tibbetts (2010), infringing a registered copyright carries with it the risk of statutory damages, an injunction against shipping products incorporating the OSS, and possibly other penalties; some may find this a surprising consequence of using "free" software. That means misunderstanding the license that comes with OSS can have detrimental effects upon an organization. Use of the software is permitted only as long as the organization complies with the terms of the license.
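To make the compliance risk concrete, the sketch below shows one way an organization might audit the licenses of its dependencies before shipping. This is a minimal illustration, not a tool from the cited research: the manifest format and the grouping of licenses into copyleft and attribution-only categories are simplifying assumptions, and any real audit would involve dedicated tooling and legal review.

# A minimal sketch of a pre-release license audit (illustrative assumptions).
# Hypothetical manifest: one "name,license" pair per line, e.g. "libfoo,GPL-2.0".

COPYLEFT = {"GPL-2.0", "GPL-3.0", "AGPL-3.0"}             # require source availability
ATTRIBUTION_ONLY = {"MIT", "BSD-3-Clause", "Apache-2.0"}  # require credit/notices

def audit(manifest_lines):
    """Return (obligations, unknown) lists for a dependency manifest."""
    obligations, unknown = [], []
    for line in manifest_lines:
        name, _, license_id = line.strip().partition(",")
        if license_id in COPYLEFT:
            obligations.append(f"{name}: {license_id} - source code must be offered")
        elif license_id in ATTRIBUTION_ONLY:
            obligations.append(f"{name}: {license_id} - preserve copyright notices")
        else:
            unknown.append(f"{name}: '{license_id}' needs legal review")
    return obligations, unknown

if __name__ == "__main__":
    sample = ["libfoo,GPL-2.0", "libbar,MIT", "libbaz,Custom-EULA"]
    duties, review = audit(sample)
    print("\n".join(duties + review))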

Benefits Outweigh The Risks

The decision to integrate OSS into an organization's application portfolio is not a simple one. The choice to distribute OSS as part of a product can lead to legal issues if the license assigned to the code is not fully understood. The benefits, though mixed, overall outweigh the risks according to the research offered in this paper. If an organization understands the risks and benefits of deploying OSS, and follows a systematic model for the implementation and reuse of code, there is a good chance it will achieve some economic gain from a shorter development cycle and, equally important, a high-quality product.

Further Study

A future project to add to this study could be a qualitative study involving several interviews with executives, managers, and developers to provide insight into the intent of each group and to help understand how each of the aforementioned factors impacted each group's decision to participate in an OSS project. Another possible project to add to this study could be a quantitative survey of executives, managers, and developers to provide an analysis, and some predictability, of success based on each group's level of understanding of the aforementioned factors that impact the decision to deploy OSS.

References

About the Open Source Initiative (n.d.). Retrieved from http://www.opensource.org/about

Ajila, S., & Wu, D. (2007). Empirical study of the effects of open source adoption on software development economics. The Journal of Systems and Software, 80(9), 1517-1529. doi:10.1016/j.jss.2007.01.011

Behlendorf, B. (1999). Open source as a business strategy. In DiBona, C., Ockman, S., & Stone, M. (Eds.), Open sources: Voices from the open source revolution (pp. 149-170). O’Reilly. Retrieved from http://oreilly.com/catalog/opensources/book/brian.html

Capra, E., Francalanci, C., & Merlo, F. (2006). An empirical study on the relationship between software design quality, development effort and governance in open source projects. IEEE Transactions on Software Engineering, 34(6), 765-774. doi:10.1109/TSE.2008.68

Carillo, K., & Okoli, C. (2008). The open source movement: A revolution in software development. Journal of Computer Information Systems, 49(2), 1-9. Retrieved from Business Source Complete Database.

Castelluccio, M. (2008). Enterprise open source adoption. Strategic Finance, 90(5), 57-58. Retrieved from Business Source Complete Database.

Dahlander, L. (2004). Appropriating the commons: Firms in open source software. International Conference on Software Engineering, St. Louis, Missouri. doi:10.1145/1083258.1083269

David, P., & Rullani, F. (2008). Dynamics of innovation in an “open source” collaboration environment: Lurking, laboring, and launching FLOSS projects on SourceForge. Industrial and Corporate Change, 17(4), 647-710. doi:10.1093/icc/dtn026

Dornan, A. (2008). The five open source business models. Retrieved May 25, 2010, from http://www.informationweek.com.

Frakes, W., & Terry, C. (1996). Software reuse: Metrics and models. ACM Computing Surveys, 28(2), 415-435. doi:10.1145/234528.234531

Free Software Foundation. (2010). The free software definition – GNU project. Retrieved May 25, 2010, from http://www.gnu.org/philosophy/free-sw.html

Gacek, C., & Arief, B. (2004). The many meanings of open source. IEEE Software, 21(1), 1359-1360. doi:10.1109/MS.2004.1259206

Golden, B. (2008). Open source in the enterprise: An O’Reilly radar report (1st ed.). O’Reilly Media.

Haruvy, E., Sethi, S., & Zhou, J. (2008). Open source development with a commercial complementary product or service. Production and Operations Management, 17(1), 29-43. Retrieved June 4, 2010, from ABI/INFORM Global. (Document ID: 1477481991).

Lerner, J., & Tirole, J. (2005). The economics of technology sharing: Open source and beyond. Journal of Economic Perspectives, 19(2), 99-120. Retrieved from ProQuest Database.

McConnell, S. (1999). Open source methodology: Ready for prime time? IEEE Software, 16(4), 6-8. Retrieved from Business Source Complete Database.

Perens, B. (2005). The emerging economics of open source software. Retrieved May 11, 2010, from http://perens.com/Articles/Economic.html

Porter, M. (1996). What is strategy? Harvard Business Review, 74(6), 61-78. Retrieved from ProQuest Database.

Porter, M., & Millar, V. (1985). How information gives you competitive advantage. Harvard Business Review, 63(4), 149-160. Retrieved from Business Source Complete Database.

Thomas, D., & Hunt, A. (2004). Open source ecosystems. IEEE Software, 21(4), 89-91. doi:10.1109/MS.2004.24

Ven, K., & Verelst, J. (2008). The impact of ideology on the organizational adoption of open source software. Journal of Database Management, 19(2), 58-72. doi:10.1109/MS.2008.73

Vir Singh, P., Fan, M., & Tan, Y. (2007). An empirical investigation of code contribution, communication participation and release strategies in open source software development: A conditional hazard model approach. Journal of Information Systems and Operations Management. Retrieved from the MIT Open-Source database.

Vujovic, S., & Ulhøi, J. P. (2008). Online innovation: The case of open source software development. European Journal of Innovation Management, 11(1), 142-156.

Walsh, E., & Tibbetts, A. (2010). Reassessing the benefits and risks of open source software. Intellectual Property & Technology Law Journal, 22(1), 9-13. Retrieved from the ProQuest Database.

Ward, J., & Peppard, J. (2004). Beyond strategic information systems: Towards an IS capability. Journal of Strategic Information Systems, 13(2), 167-194. Retrieved from ProQuest Database.

Value Creation/Innovation:

Exploring Value Creation Theories

By Russ Wright

Value Creation

Value creation, or innovation, as defined here, focuses on creating new products and new ideas within the field of software development. There is much debate in the literature over how value is created within software development. Some literature focused primarily on the economics of software development (Boehm, 2003; Boehm & Sullivan, 2000). Others took a more holistic approach and recognized that culture and process have the greatest effect on the ability to innovate (Highsmith & Cockburn, 2002; Karlsson & Ryan, 2002; Little, 2005; Prahalad & Ramaswamy, 2004; Quinn, Baruch, & Zien, 1996). Regardless of the path taken to explain value creation and innovation in software development, the general agreement was that the process needed to change to better fit the challenges and opportunities of a global market. As such, this paper will explore the value creation theories related to software development and the additional challenge of outsourcing to create innovation.

The purpose of this document is to explore how an organization can create value in the software development process. There is a discussion of the background of value creation within the context of agile software development principles, which contain many ways to innovate. This document also explores the theories of outsourcing software development and how they relate to value creation, or innovation, within agile development. The conclusion finds that creating value in software development is still a topic of debate among scholars and that outsourcing can add value but is full of pitfalls.

Value Creation Theories Related To Software Development

The pursuit of value creation has slowly changed the way software is developed, producing a set of methodologies loosely called agile software development. Research conducted by Fowler and Highsmith (2001) concluded that the agile development philosophy and methods sought to remove cumbersome and time-consuming barriers to value creation. The authors instead supported a philosophy of software development that focused on individuals and their interaction, creating working software, collaboration with the customer, and quick responses to change. There are several theories of value creation or innovation embedded within the principles of the agile philosophy of software development. Several of these embedded value creation theories are explored below.

Customer Defined Software Value

One theory of value creation in software development is that customer satisfaction is achieved by delivering software that the customer actually values. In their research, Highsmith and Cockburn (2002) explained that within the agile development methodologies, the customer defined value for the project and set the measure of success. For software to be valuable, the customer had to find that the product not only met their needs but was also usable and useful (Constantine & Lockwood, 1999). In an article on agile methodology, Boehm (2002) explained that generating value in the development process for the customer was achieved by emphasizing customer involvement over traditional contract negotiation. Consequently, value creation comes from the customer’s ability to define the measure of success, including the usability and usefulness of the product, and from their involvement in the development process as a team member. Making the customer a member of the team means that the requirements might change, many times and even late in the process.

Requirements Changes

A change in the requirements for an application, even late in the development process, allows a customer to build greater value into the product. The ability to prioritize the customer’s requirements through a cost-value approach produced a win-win result when creating software products (Karlsson & Ryan, 2002). Prahalad and Ramaswamy (2004) explained that requirements changed because the customer and the development team developed a greater understanding of the customer’s needs. Research conducted by Paetsch, Eberlein, and Maurer (2003) explained that the ability to adapt to the changing situation in a software development project created more value than trying to predict the customer’s requirements. A study by Cao and Ramesh (2008) warned that too much change in requirements can lead to project failure because the customer never sees the software product. Thus, the ability to change requirements is a positive, yet too much change can prevent a usable product from being delivered.
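The cost-value approach credited to Karlsson and Ryan can be illustrated with a short sketch. The requirements and numbers below are hypothetical, chosen only to show the mechanics: each requirement receives a relative customer value and a relative implementation cost, and the value-to-cost ratio drives the priority order.

# A minimal sketch of cost-value prioritization (hypothetical data).
requirements = [
    # (requirement, relative_value, relative_cost)
    ("single sign-on", 30, 10),
    ("custom themes", 10, 20),
    ("audit logging", 25, 15),
    ("offline mode", 35, 55),
]

# Highest value-to-cost ratio is built first.
for name, value, cost in sorted(requirements,
                                key=lambda r: r[1] / r[2], reverse=True):
    print(f"{name}: value/cost = {value / cost:.2f}")

In this toy example, single sign-on (ratio 3.00) would be scheduled ahead of offline mode (ratio 0.64) even though offline mode carries the highest raw value.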

Quick Delivery

The ability to deliver the software quickly builds value for the customer. According to a study by Quinn, Baruch, and Zien (1996), quick delivery of the software product made a significant difference in the ability to compete in the global market. Larman (2004) explained that innovation was accomplished when the developers, through an iterative cycle, delivered a working product to the customer, which allowed the customer to see the progress and adjust requirements quickly to the changing market. Yet Nerur, Mahapatra, and Mangalaraj (2005) cautioned that organizations attempting to adopt agile methods that deliver quickly could fail because they were often unprepared for the radical change in behavior. Therefore, quick delivery of the software product builds value by allowing the customer to adapt and refine the product, but it also holds a potential for disaster if not managed well. Managers, customers, and developers have to work together well to build value.

Collaboration with Managers

The development team and managers must communicate daily to build value in the development process. Cohn and Ford (2003) explained that managers had to adapt to a new style of leadership in which they were required to relinquish some of what they perceived as control. The authors posited that traditional development plans offering specific delivery dates were probably padded and inaccurate, and that through participation managers could see the product delivered more quickly and with fewer resources. According to research conducted by Patton (2002), regular discussions with managers added the benefit of clearing the roadblocks and bottlenecks that would slow down the project and add costs. The managers were also able to see how the product met the customer’s needs. A study conducted by Augustine, Payne, Sencindiver, and Woodcock (2005) explained that without constant communication, managers would often fall back into trying to manage the project using linear approaches as they attempted to regain control, which led to lost time and possibly project failure. As a result, constant communication with managers helps create value by not only keeping them informed but also empowering them to clear obstacles that might slow down delivery. Managers can also provide considerable motivation and support to help create value in the software development process.

Motivation and Support

The motivation and support of the management team have a significant impact on value creation in software development. Research conducted by Ceschi, Sillitti, Succi, and De Panfilis (2005) into development project success factors found that team members ranked motivation from management, in the form of support and training, among the top key factors for innovation. The study further showed that managers agreed that their support was a significant factor in the success of, and value creation within, the project. A study conducted by Asproni (2004) into the benefits of motivation in software development teams showed that highly effective teams benefited most and achieved the best results when management provided, among other factors, a clear elevating goal, a unified commitment to the project, and a collaborative climate. The study further explained that this support gave the team members the ability to innovate and build better quality software, often with fewer resources. A research survey conducted by Forward and Lethbridge (2002) found that the management team had a significant impact on the performance of the development process when they provided the team with the proper tools to improve automation. The team saw these tools as support and found motivation to perform at a higher level. As such, support and motivation from management have a reciprocal effect on the development process and the ability of the developers to innovate and create value.

In-Person Meetings for Tacit Knowledge Transfer

Value creation can be significantly affected by the ability of team members to meet and share knowledge. Nonaka and Takeuchi (1995) drew on the work of Polanyi (1966) and explained that tacit knowledge is personal, context specific, and difficult to explain. They further explained that this knowledge, gained from experience, can be lost if an individual leaves an organization without sharing it. Likewise, the members of a development team, which includes the customer, must meet often to share knowledge, establish the context of the requirements, and build new knowledge, which creates value (Dybå & Dingsøyr, 2008). Cohn (2004) explained that when teams met to transfer knowledge, the use of stories to explain the requirements helped build up individual and group knowledge as they shared. The author did warn that this process might not work well in very large teams, but acknowledged the positive impact of tacit knowledge sharing on innovation. A study conducted by Chau, Maurer, and Melnik (2003) found that knowledge sharing within agile development teams, particularly when done in face-to-face settings, helped to build trust among team members and increased the team’s ability to function together. Therefore, sharing knowledge among team members, especially in face-to-face formats, helps build team trust and create value in the development process.

Working Software as Measurement of Success

The ability of management to measure the progress of a development project has a significant impact on the ability of the software development team to create value. Boehm and Turner (2005) explained that agile development processes do not include the typical milestones and other measurement techniques common to traditional development methods. They further explained that completed functional stories could serve as a replacement for these measures, as they show the amount of work completed in a particular development phase. A case study by Fitzgerald, Hartnett, and Conboy (2006) demonstrated that the ability of the management team to measure the progress of the team, based on the amount of working code, increased project performance by reducing the amount of paperwork required in previous, traditional software development projects. The developers spent less time writing reports and more time on actual development, which accelerated the development process. In a recent report, Lapham, Williams, Hammons, Burton, and Schenker (2010) explained that progress within an agile project was measured by gathering the customer’s value assigned to each completed part of the development project. As these pieces were completed, they were used as a measure of how much the customer valued the product at that time. Thus, the ability of management to measure progress, along with the customer’s assigned value at each phase of development, adds value to the project, particularly because it reduces the reporting workload on the development team, freeing them to perform more development tasks.
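The idea of progress measured by delivered customer value, rather than by milestones or status reports, reduces to a small calculation. The sketch below is illustrative only; the stories and the values assigned to them are assumptions, not data from the cited studies.

# A minimal sketch: progress as the share of customer-assigned value
# delivered as completed, working functionality (hypothetical data).
stories = [
    # (story, customer_assigned_value, completed)
    ("login", 8, True),
    ("search", 13, True),
    ("reporting", 21, False),
    ("export", 5, False),
]

delivered = sum(value for _, value, done in stories if done)
total = sum(value for _, value, _ in stories)
print(f"Delivered customer value: {delivered}/{total} "
      f"({100 * delivered / total:.0f}%)")  # here: 21/47 (45%)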

Realistic Schedules

The proper attitude toward development cost and schedule when managing a project will have an impact on the development team’s ability to create value. Glass (2001) explained that successful projects had a realistic schedule that was not a death march to the finish, and the developers worked a normal workweek. Developing innovative products did not require that software development projects be managed by schedule or costs, as that would distract from the real objective of creating a profitable product that met customer needs and provided a competitive advantage (Poppendieck & Poppendieck, 2003). Likewise, an exploratory study by Begel and Nagappan (2007) explained that one of the benefits of implementing agile software methods was the flexibility of the process, which gave the developers the ability to change direction when a rigid schedule would not have worked. Thus, tight management of schedule and costs, as is common in traditional development processes, would inhibit value creation because it lacks flexibility.

Technical Excellence

The skill level of each member of a development team will have an impact on the value creation ability of an organization. An organizational culture that supported and provided opportunities for growth in skills was desirable because it led to productivity (Wendorff, 2002). A study conducted by Chow and Cao (2008) showed that team capability and delivery strategy ranked highest among critical success factors. Technical excellence also extends to the tools used by the software development team, as good quality tools affect the ability of the team to innovate (Hanssen & Fægri, 2008). For these reasons, a skilled software development team, equipped with the proper tools and supported by opportunities for growth, is an important factor for innovation and value creation in software development.

Keep It Simple – Maximize Effort

The design of the software program can impact the ability of the software development team to create value in the development process. In the agile development process, one of the first steps is writing the test for the specific functionality. By writing the test first, the developer writes code to the test and minimizes the additional code needed to meet the test’s requirements and functionality (Poppendieck & Poppendieck, 2003). In a paper by Lindstrom and Jeffries (2004), the authors explained that value was achieved by keeping the design as simple as possible so that the design matched the functionality and included no wasted motion. They further explained that the design was regularly reviewed to keep effort to the minimum and maximize efficiency. Hence, value creation in software development can be improved by coding only what is required and reviewing the code regularly to increase efficiency.
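A minimal sketch of this test-first cycle appears below. The example is hypothetical and not drawn from the cited sources: the tests are written first to define the required behavior, and only enough production code is written to make them pass, with nothing speculative added.

# A minimal test-first sketch (hypothetical example).
import unittest

def apply_discount(price, percent):
    """Just enough code to satisfy the tests below; nothing speculative."""
    return round(price * (1 - percent / 100), 2)

class DiscountTest(unittest.TestCase):
    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(100.0, 10), 90.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)

if __name__ == "__main__":
    unittest.main()

Because the tests define what “done” means, any further design elaboration would be wasted motion in the sense Lindstrom and Jeffries describe.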

Team Self-Organization

The ability of a software development team to reorganize itself into different configurations as the situation dictates can affect the organization’s ability to innovate. Cockburn and Highsmith (2002) explained that the ability of a software team to reorganize as the situation dictated was important for making decisions quickly and dealing with ambiguity. A paper by Decker, Ras, Rech, Klein, and Hoecht (2005) explained that the ability to reorganize as a development team allowed for the reuse of engineering knowledge in new projects. The benefit of knowledge reuse was a key factor in the reasons given for reorganizing the team to fit the new situation. Yet there is reason for caution when attempting to use a self-organizing team philosophy. A study conducted by Moe, Dingsøyr, and Dybå (2008) found that the very specialized skills of certain team members and the uneven division of work among the team presented barriers to realizing a truly self-organizing team. Therefore, value creation with regard to self-organizing teams depends upon the ability of a development team to reorganize quickly to meet new challenges, and on the balancing of skills and workload within the team.

Team Reflection

The ability of a development team to reflect at regular intervals upon the entire project can impact the team’s ability to create value. According to research conducted by Salo, Kolehmainen, Kyllönen, Löthman, Salmijärvi, and Abrahamsson (2004), post-iteration workshops provided significant help in improving and optimizing practices and enhancing the learning and satisfaction of the project team. The authors further explained that the cost of the workshops was quite small, and the benefits quite large. Cockburn (2002) echoed this same idea and explained that after-process reviews were helpful in growing the skill of the team and improving the skill sets of the participants. A review at the end of the development cycle, where the participants shared their experiences, significantly enhanced the development process (Dingsøyr & Hanssen, 2003). Thus, reflection builds the ability of a software development team to innovate by improving and optimizing the team’s practices.

Within each of the principles behind agile software development are theories of value creation for a software development team. Allowing the customer to set the measures of success for the product creates value by building trust. The ability to adapt to changing requirements allows the development team to innovate and meet the customer’s needs. Quick and frequent delivery of a working product, even if it is not complete, builds value for the customer and the development team as both gain credibility. Constant communication with management helps to build trust in the team and gives the team the freedom to innovate and create value for the organization. Closely related is the need for management to measure progress by the completeness of the project; using completeness as the measure lets management see progress and eliminates additional paperwork for the developers, freeing them to write more code. Also related to management is the use of measures other than cost and schedule, which would detract from the real goals of the project. Value is also created through management’s support in the form of proper training and tools, which brings about technical excellence. The adaptable, self-organizing team, a difficult goal to reach, likewise brings about value creation by allowing the team to adjust to the fluid situations found in software development. Lastly, one of the most important and relatively inexpensive ways a software development team can create value is by reflecting regularly on the development process and integrating the lessons learned, thus constantly improving the ability to innovate.

Value Creation Theories and Outsourcing Of Software Development

So far the exploration of value creation has focused on software development teams in local settings. The added factor of distance, between the development team and the customer, between development teams, or even between members of the same team, introduces further opportunities for innovation as well as for failure. A review of the current literature shows much disagreement about the benefits and potential for success when outsourcing the development of software. Some of the theories, both pro and con, related to value creation, agile development, and outsourcing are explored below.

Effects on Many Levels

Outsourcing of software development in general creates new opportunities for value creation, but also brings many challenges. A recent paper by Herbsleb and Moitra (2002) explained that separating a software development team across the globe could introduce many problems, including how the project manager divides up the work and how resistance to the process is handled. They further added that many cultural issues, including attitudes toward management, perceptions of time, and communication styles, all affected the successful outsourcing of a project. A paper by Ågerfalk, Fitzgerald, Holmström, Lings, Lundell, and Conchúir (2005) explained that communication between team members, coordination of activities, and control of the project were all challenged by distance. The authors further explained that only when strong supporting processes were in place could an outsourced project work. Hence, the challenges of distance, communication, culture, and command and control in an outsourced software development project must be addressed with strong supporting principles and methodologies, such as agile, to support value creation.

Even if an organization uses agile methods to create value in an outsourced project, taking advantage of their focus on dealing with ambiguity, change, and communication, there are still many challenges that must be overcome. In their research, Carmel and Agarwal (2002) identified three critical challenges of outsourcing software development: (1) coordination, (2) control, and (3) communication. Coordination was defined as integrating tasks across each unit so they all contribute to the whole. Control was defined as following the goals, policies, and standards of the organization. Communication was defined as the exchange of information that is understood by those communicating. Thus, an understanding of how to deal with these three critical challenges is required to achieve value creation when outsourcing. Each of these challenges, within the context of agile development principles, is explored further below.

Coordination

The division of tasks when outsourcing software development can impact the ability of an organization to create value. According to Shrivastava and Date (2010), agile teams distributed across too wide a time-zone difference suffered from poor performance because they had little overlapping time in which to coordinate activities. One possible solution to coordination problems, suggested by the research of Carmel, Espinosa, and Dubinsky (2010), was handing off the work from one site to the next toward the end of the workday, going around the globe in the direction of the sun. The authors admitted that this solution was not yet entirely proven, but it did present a method that might help in the coordination of distributed software development teams. Another possible solution, suggested by the research of Wahyudin, Heindl, Eckhard, Schatten, and Biffl (2008), was the use of a notification software tool that supported the agile development methodology and managed the interdependent tasks, giving the project manager and the team members a way to coordinate activities. Hence, the coordination of tasks by the project manager and team members must be managed well to achieve value for the software development project.
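The time-zone problem can be made concrete with simple arithmetic. The sketch below assumes a standard 9:00-17:00 working day at each site (the offsets and city pairings are illustrative, not figures from the cited research) and computes the daily overlap available for synchronous coordination; sites with no overlap must rely on hand-offs or asynchronous tooling of the kinds described above.

# A minimal sketch of working-hour overlap between two sites.
# Simplified: ignores calendar wrap-around near the date line.
def overlap_hours(utc_offset_a, utc_offset_b, start=9, end=17):
    """Return shared working hours per day between two sites."""
    a = (start - utc_offset_a, end - utc_offset_a)  # site A's day in UTC
    b = (start - utc_offset_b, end - utc_offset_b)  # site B's day in UTC
    return max(0, min(a[1], b[1]) - max(a[0], b[0]))

print(overlap_hours(-5, 1))    # e.g. New York vs. Berlin: 2 shared hours
print(overlap_hours(-8, 5.5))  # e.g. Seattle vs. Bangalore: 0 shared hours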

Control

Adherence to organizational policies, goals, and standards can impact value creation when outsourcing software development. In his research on outsourcing strategies, Jennings (1997) explained that one of the most important factors for successful outsourcing was the protection and development of the core capabilities that gave an organization its competitive edge. Research by Sutherland, Schoonheim, Rustenburg, and Rijk (2008) found that exceptional productivity, and therefore value creation, in a development project among distributed teams was possible when the teams fully integrated the agile method. This was achieved by bringing the teams together, instilling the agile goals and methods, and then separating them. The authors acknowledged that the time spent instilling the agile principles and philosophy was a major contributing factor for success. Therefore, instilling the relevant goals and standards, especially the principles of agile development mentioned above, gives a competitive edge, contributes to the success of outsourced projects, and helps build value.

Communication

The most important factor for value creation when outsourcing software development is communication. Research conducted by Sutherland, Viktorov, Blount, and Puntikov (2007) showed that communication, particularly across cultures, presented a significant obstacle because it limited productivity. The authors further explained that the solution that worked best was full integration of the agile teams, with members distributed around the globe, as opposed to teams divided by geography. They argued that although this method slowed the development process somewhat compared to an agile project done in a single location, it increased communication and built trust among the team members. A recent research paper by Shrivastava and Date (2010) concluded that knowledge management and communication were among the major problems encountered when software development was outsourced. They proposed an interesting solution where the agile teams used a web-based knowledge-management wiki to assist in the capture of experiences. They further suggested that teams should still be brought together at different times in the development process to work together and build trust. Hence, a solid plan for communication among distributed software development teams is required to achieve value creation.
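As a stand-in for the wiki-based experience capture that Shrivastava and Date propose, the sketch below appends a structured lessons-learned entry to a shared page. The file name and entry fields are illustrative assumptions rather than the authors' actual tool; the point is only that captured knowledge outlives the conversation in which it surfaced.

# A minimal sketch of structured experience capture (illustrative fields).
from datetime import date

def log_lesson(page_path, team, summary, detail):
    """Append one structured lessons-learned entry to a shared page."""
    entry = (
        f"\n## {date.today().isoformat()} - {team}\n"
        f"**Summary:** {summary}\n\n{detail}\n"
    )
    with open(page_path, "a", encoding="utf-8") as page:
        page.write(entry)

log_lesson("lessons_learned.md", "offshore team",
           "Nightly hand-off notes reduced rework",
           "Posting open questions before sign-off cut next-day blockers.")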

No Easy Answers

Agile software development philosophies and methods provide many opportunities to create value in software development. Much of the research into agile shows the potential for value creation when an organization is willing to embrace the philosophy and create a culture that supports and celebrates innovation. Embracing the philosophy and building the culture will take time and training at all levels of an organization. There is still much to be researched and solved before true value creation is realized when outsourcing software development. Agile methods hold much promise, but still face many challenges in making the process work with teams spread across the globe. For an organization that already uses agile development principles, outsourcing might create additional value. For an organization that already has outsourced projects, adding agile development principles might also increase innovation. However, an organization with weak development procedures would risk much by trying to add agile principles and outsource at the same time, and is likely to reduce rather than improve value creation.

References

Ågerfalk, P. J., Fitzgerald, B., Holmström, H., Lings, B., Lundell, B., & Conchúir, E. (2005). A framework for considering opportunities and threats in distributed software development. In International Workshop on Distributed Software Development (pp. 47-61). Citeseer.

Asproni, G. (2004). Motivation, teamwork, and agile development. Agile Times, IV (1), 8–15.

Augustine, S., Payne, B., Sencindiver, F., & Woodcock, S. (2005). Agile project management: Steering from the edges. Communications of the ACM, 48(12), 85-89.

Begel, A., & Nagappan, N. (2007). Usage and perceptions of agile software development in an industrial context: An exploratory study. In First International Symposium on Empirical Software Engineering and Measurement (ESEM 2007) (pp. 117-125). Presented at the First International Symposium on Empirical Software Engineering and Measurement, Madrid, Spain: ESEM. doi:10.1109/ESEM.2007.85

Boehm, B. (2002). Get ready for agile methods, with care. Computer, 27(4), 64-69.

Boehm, B. (2003). Value-based software engineering. ACM SIGSOFT Software Engineering Notes, 28(2), 1-12.

Boehm, B., & Turner, R. (2005). Management challenges to implementing agile processes in traditional development organizations. IEEE software, 21(2), 30-39.

Boehm, B. W., & Sullivan, K. J. (2000). Software economics: A roadmap. In Proceedings of the conference on The future of Software engineering (pp. 319-343). ACM.

Cao, L., & Ramesh, B. (2008). Agile requirements engineering practices: An empirical study. Software, IEEE, 25(1), 60-67.

Carmel, E., & Agarwal, R. (2002). Tactical approaches for alleviating distance in global software development. Software, IEEE, 18(2), 22-29.

Carmel, E., Espinosa, J. A., & Dubinsky, Y. (2010). Follow the sun workflow in global software development. Journal of Management Information Systems, 27(1), 17-38.

Ceschi, M., Sillitti, A., Succi, G., & De Panfilis, S. (2005). Project management in plan-based and agile companies. Software, IEEE, 22(3), 21-27.

Chau, T., Maurer, F., & Melnik, G. (2003). Knowledge sharing: Agile methods vs. tayloristic methods. In Enabling Technologies: Infrastructure for Collaborative Enterprises, 2003. WET ICE 2003. Proceedings. Twelfth IEEE International Workshops on (pp. 302-307). IEEE.

Chow, T., & Cao, D. (2008). A survey study of critical success factors in agile software projects. Journal of Systems and Software, 81(6), 961-971. doi:10.1016/j.jss.2007.08.020

Cockburn, A. (2002). Agile software development. Boston, MA USA: Addison-Wesley.

Cockburn, A., & Highsmith, J. (2002). Agile software development: The people factor. Computer, 34(11), 131-133.

Cohn, M. (2004). User stories applied: For agile software development. Addison-Wesley Professional.

Cohn, M., & Ford, D. (2003). Introducing an agile process to an organization [software development]. Computer, 36(6), 74-78.

Constantine, L. L., & Lockwood, L. A. D. (1999). Software for use: A practical guide to the models and methods of usage-centered design. New York, NY, USA: ACM Press/Addison-Wesley Publishing Co.

Decker, B., Ras, E., Rech, J., Klein, B., & Hoecht, C. (2005). Self-organized reuse of software engineering knowledge supported by semantic wikis. In Proceedings of the Workshop on Semantic Web Enabled Software Engineering (pp. 126-135). ACM.

Dingsøyr, T., & Hanssen, G. K. (2003). Extending agile methods: Postmortem reviews as extended feedback. Advances in Learning Software Organizations, 4-12.

Dybå, T., & Dingsøyr, T. (2008). Empirical studies of agile software development: A systematic review. Information and Software Technology, 50(9-10), 833-859. doi:10.1016/j.infsof.2008.01.006

Fitzgerald, B., Hartnett, G., & Conboy, K. (2006). Customising agile methods to software practices at Intel Shannon. European Journal of Information Systems, 15(2), 200-213.

Forward, A., & Lethbridge, T. C. (2002). The relevance of software documentation, tools and technologies: A survey. In Proceedings of the 2002 ACM symposium on Document engineering (pp. 26-33). ACM.

Fowler, M., & Highsmith, J. (2001). Manifesto for agile software development. Retrieved February 6, 2011, from http://agilemanifesto.org/

Glass, R. (2001). Agile versus traditional: Make love, not war! Cutter IT Journal, 14(2), 12-18.

Hanssen, G. K., & Fægri, T. E. (2008). Process fusion: An industrial case study on agile software product line engineering. Journal of Systems and Software, 81(6), 843-854.

Herbsleb, J. D., & Moitra, D. (2002). Global software development. Software, IEEE, 18(2), 16-20.

Highsmith, J., & Cockburn, A. (2002). Agile software development: The business of innovation. Computer, 34(9), 120-127.

Jennings, D. (1997). Strategic guidelines for outsourcing decisions. Strategic Change, 6(2), 85-96.

Karlsson, J., & Ryan, K. (2002). A cost-value approach for prioritizing requirements. Software, IEEE, 14(5), 67-74.

Lapham, M. A., Williams, R., Hammons, C., Burton, D., & Schenker, A. (2010). Considerations for using agile in DoD acquisition (Technical Note No. CMU/SEI-2010-TN-002). Hanscom AFB, MA: Carnegie Mellon.

Larman, C. (2004). Agile and iterative development: A manager’s guide. Prentice Hall.

Lindstrom, L., & Jeffries, R. (2004). Extreme programming and agile software development methodologies. Information Systems Management, 21(3), 41-52.

Little, T. (2005). Value creation and capture: A model of the software development process. Software, IEEE, 21(3), 48-53.

Moe, N. B., Dingsøyr, T., & Dybå, T. (2008). Understanding self-organizing teams in agile software development. In Software Engineering, 2008. ASWEC 2008. 19th Australian Conference on (pp. 76-85). IEEE.

Nerur, S., Mahapatra, R. K., & Mangalaraj, G. (2005). Challenges of migrating to agile methodologies. Communications of the ACM, 48(5), 72-78.

Nonaka, I., & Takeuchi, H. (1995). The knowledge creating company: How Japanese companies create the dynamics of innovation. Oxford, UK: Oxford University Press.

Paetsch, F., Eberlein, A., & Maurer, F. (2003). Requirements engineering and agile software development. In Enabling Technologies: Infrastructure for Collaborative Enterprises, 2003. WET ICE 2003. Proceedings. Twelfth IEEE International Workshops on (pp. 308-313). IEEE.

Patton, J. (2002). Hitting the target: Adding interaction design to agile software development. In OOPSLA 2002 Practitioners Reports (p. 1). Presented at the Object-Oriented Programming, Systems, Languages, and Applications Conference, Seattle, WA: ACM.

Polanyi, M. (1966). The tacit dimension. London: Routledge and Kegan Paul.

Poppendieck, M., & Poppendieck, T. (2003). Lean software development: An agile toolkit. Addison-Wesley Professional.

Prahalad, C. K., & Ramaswamy, V. (2004). Co-creation experiences: The next practice in value creation. Journal of Interactive Marketing, 18(3), 5-14.

Quinn, J. B., Baruch, J. J., & Zien, K. A. (1996). Software-based innovation. The McKinsey Quarterly, (4), 94-96.

Salo, O., Kolehmainen, K., Kyllönen, P., Löthman, J., Salmijärvi, S., & Abrahamsson, P. (2004). Self-adaptability of agile software processes: A case study on post-iteration workshops. Extreme Programming and Agile Processes in Software Engineering, 13(2), 184-193.

Shrivastava, S. V., & Date, H. (2010). Distributed agile software development: A review. Journal of Computer Science and Engineering, 1(1), 10-17.

Sutherland, J., Schoonheim, G., Rustenburg, E., & Rijk, M. (2008). Fully distributed scrum: The secret sauce for hyperproductive offshored development teams. In Expanding Agile Horizons (pp. 339-344). Presented at the Agile 2008 Conference, Toronto, Canada: IEEE.

Sutherland, J., Viktorov, A., Blount, J., & Puntikov, N. (2007). Distributed scrum: Agile project management with outsourced development teams. In Information Technology in Health Care (pp. 274-284). Presented at the 40th Hawaii International Conference on System Sciences, Waikoloa, Big Island, Hawaii, USA: IEEE Computer Society. doi:10.1109/HICSS.2007.180

Wahyudin, D., Heindl, M., Eckhard, B., Schatten, A., & Biffl, S. (2008). In-time role-specific notification as formal means to balance agile practices in global software development settings. In Lecture Notes in Computer Science (Vol. 5082, pp. 208-222). Springer.

Wendorff, P. (2002). Organisational culture in agile software development. Lecture Notes In Computer Science, 17(2559), 145-157.