Contact Scott Sedam via e-mail at scott@TRUEN.com
By the time you read this, the 2003 J.D. Power survey results on home buyer satisfaction will be public. From across the land, we hear shouts of joy, cries of anguish, gnashing of teeth and articulate expressions of denial. Last year J.D. Power rated builders in 16 cities, and this year it’s measuring more than 20. It’s getting more and more difficult to hide, and customer satisfaction has become a high-stakes game.
To refresh your memory, J.D. Power surveys an entire market, pulling names and addresses of new home buyers from public records. The surveys go to the customers of all home builders that produce enough units to qualify, not just the ones that subscribe to J.D. Power’s services. It’s a huge undertaking, and through this process J.D. Power captures the customer experience from buyers of all major and midsize builders. It’s a little hard to determine where the cutoff is, but depending on the market, if you build more than 100 homes, you likely will show up in the final ranking report. Just about everyone building more than 200 units gets ranked.
If you simply go to www.jdpower.com, you can see the latest results for your city, presuming it has been surveyed. (By the way, if J.D. Power has measured your city before and this is the first time you’ve gone to its Web site to study the results, consider yourself chastised.)
The way J.D. Power presents its home builder rankings this year is clever, although perhaps not entirely fair. It takes all of the builder total scores in a single market and strikes a mean. If a company is above the mathematical average, it’s listed in order of placement. Any company below the mean is consigned to the "below-average dungeon." Let’s say 50 builders are rated, 25 above the mean, 25 below the mean, and yours is 26th. The only thing you - and the public - will ever know is that your company is below average. In our society, below average translates into "you stink."
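For readers who like to see the arithmetic, here is a minimal sketch of that split, assuming each builder boils down to a single composite score. The builder names and numbers below are invented for illustration; J.D. Power's actual scoring model is its own and is not reproduced here.

```python
# A minimal sketch of the above/below-the-mean split described above.
# Builder names and composite scores are invented for illustration;
# J.D. Power's actual scoring model is proprietary and not shown here.

def split_rankings(scores):
    """Rank builders at or above the market mean; lump the rest as 'below average'."""
    mean = sum(scores.values()) / len(scores)
    above = sorted((b for b, s in scores.items() if s >= mean),
                   key=scores.get, reverse=True)
    below = [b for b, s in scores.items() if s < mean]
    return above, below

market = {"Builder A": 128, "Builder B": 117, "Builder C": 102, "Builder D": 95}
ranked, lumped = split_rankings(market)
print("Listed in order of placement:", ranked)    # ['Builder A', 'Builder B']
print("Below average (no order shown):", lumped)  # ['Builder C', 'Builder D']
```

The point the sketch makes plain: everyone above the mean gets a visible place in line, while everyone below it gets the same undifferentiated label, whether they missed the mean by a point or by fifty.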
Now it’s true that you can buy your detailed J.D. Power report (last I heard, that costs more than $30,000) and find out exactly where you stand. I suppose you could turn that into compelling advertising copy, as in, "We’re below average - but not as bad as you think!" But the damage is done. Consumers have gotten the message, and any prospective home buyer who visits a model of any builder finishing in the top five is sure to walk away with a printout of the rankings. Presuming those prospects still decide to visit your model, your salespeople will have some explaining to do.
Where’s the Value?
Is the J.D. Power survey a good thing? I have mixed feelings about that. The survey has been a prime mover in the campaign to convince builders they can no longer ignore customer opinion. As proof, consider that in 1990 almost no one in the industry had a professionally managed and reported customer satisfaction survey. Now, barely more than a decade later, nearly everyone has one, and customer satisfaction is considered a key performance indicator for both individuals and organizations. Let’s face it: J.D. Power, Harris and Gallup are the names most familiar to consumers, and because Harris and Gallup aren’t in the business of selling their "seal of approval" (using your J.D. Power Trophy in an ad costs big bucks), J.D. Power pretty much owns the franchise. It isn’t going away.
I have seen pretty interesting reactions to the publishing of J.D. Power results in the past few years. In a few cases, it was received as a wake-up call, leading to major changes in how a builder does business. I also have witnessed a couple of terrific cases of denial. One builder tried to convince me that the top three spots are available to the highest bidders - something that everyone, of course, knows! I asked him if he really believed that J.D. Power would risk its entire reputation and future by selling the top rankings. This doesn’t even consider the intricate level of collusion required among the top scorers and J.D. Power to come up with who finishes where and for how much. In short, a totally preposterous notion, yet I have heard the idea floated more than once.
I am perhaps most curious about the several large national builders that have done consistently poorly in the survey in virtually every location measured. One well-known GIANT in particular has yet to finish above average a single time in a single city! This, I suggest, is hard to do. To be measured time and again and finish below the mean every single time nearly defies the odds; surely at least one of its locations should have this customer thing figured out well enough to land in the top 50%. I wonder what the officers of this company say to each other. Are they angry? Are they in denial? Are they making sincere efforts to remedy the problems or just sending out improve-or-die memos? Has this ever come up in a board of directors meeting? I’m guessing it has not.
Playing the Game Differently
One national firm, Pulte Homes, finished far and away ahead of every other last year on a national basis. In this issue, in a totally separate customer satisfaction study, two divisions of the company, Pulte Phoenix and Pulte Minnesota, took the top spots among high-volume production builders.
Pulte’s home buyers consider the company to be the best, but do you? Is this a firm to be admired and emulated, or do you rationalize some explanation for how it consistently scores so well? J.D. Power doesn’t officially do this calculation, but you can figure it out: Just go back to its Web site and count how often the various national names show up in the top 10 for each city J.D. Power measured last year. Assign some point value to their rankings, add the points and then divide. If you set Pulte as a 10 on your 1-10 scale, the next-highest national builder would be about a 3. There is no comparison. No one is even close.
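If you want to run that tally yourself, here is a rough sketch of the back-of-the-envelope math. The cities, builder names and the 10-points-for-first-down-to-1-for-tenth scheme are assumptions made purely for illustration; the exercise above only says to pick some point value, add and divide, and any sensible scale will tell the same story.

```python
# A back-of-the-envelope roll-up of the kind described above. The city
# results, builder names and point scheme (10 points for 1st place down
# to 1 point for 10th) are assumptions for illustration only; J.D. Power
# publishes no such national calculation.

from collections import defaultdict

# Hypothetical top-10 finishes for national builders, keyed by city.
city_top_tens = {
    "City 1": [("National Builder X", 1), ("National Builder Y", 7)],
    "City 2": [("National Builder X", 2)],
    "City 3": [("National Builder X", 1), ("National Builder Z", 9)],
}

points = defaultdict(float)
for finishes in city_top_tens.values():
    for builder, place in finishes:
        points[builder] += 11 - place          # 1st = 10 pts ... 10th = 1 pt

cities_measured = len(city_top_tens)
avg = {b: p / cities_measured for b, p in points.items()}

# Rescale so the leader sits at 10 on a 1-10 scale, as suggested above.
leader = max(avg.values())
scaled = {b: round(10 * a / leader, 1) for b, a in avg.items()}
print(scaled)  # {'National Builder X': 10.0, 'National Builder Y': 1.4, 'National Builder Z': 0.7}
```

However you weight the places, the gap shows up the same way: one builder piles up top-10 finishes across nearly every measured market while the rest appear only here and there.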
When I saw the rankings a year ago, I thought this would be a big deal. I thought I’d see it written about, brought up at conventions and industry meetings - and I thought I’d get phone calls from competitors asking me if I knew how Pulte did it. Hardly a peep from anyone, I have to report. This would be roughly equivalent to a sprinter winning Olympic gold in the 100 meters by a full second and no one expressing any curiosity about it.
Now, I write this not knowing how things will turn out in the J.D. Power rankings this year. But my forecast is that Pulte again will smoke the field on a national basis. Sure, there are excellent local and regional builders that will compete with Pulte and even beat it on a city-by-city basis. But among those that build nationwide, I predict Pulte will retain the crown.
The obvious question is: Why? Many of the answers are in this issue, and we’ll continue our exploration next month. But for the sake of your learning, please do some soul-searching with your team first. Look up who finished in the top five in your city and then ask yourself, your people, your suppliers and trades: What do these builders do that is different from what we do? Write it down. Make a specific, definitive list in black and white. Next month we’ll compare lists. That should be interesting.