I’ve had to replace two small appliances since moving into my home, and I found a good, local store that carries a wide range of products. More important, the salespeople there work well for me because (and this won’t be a surprise to anyone who reads this blog) they are good teachers. They answer my questions, help me to see the things I’m not trained to see, and guide me to appropriate decisions by helping me to build my understanding. They are patient and conversational. Their service feels personal — the right amount of scriptedness and unscriptedness.
Appliance replacement #1 went without a hitch. Browsing in the store (me, my wife) led to teaching (the salesperson), learning (me, my wife), a transaction (me, my wife, the store), a delivery, and use.
Appliance replacement #2 was going without a hitch, following the same steps, only enhanced by good note-taking on the store’s part. They remembered us, had a record of our last transaction, and we built on a growing bank of mutual trust.
Until the bots showed up.
After my purchase the second time, I received an automated email that asked me a question: “Can you take 30 seconds and leave us a quick review?” This was new. It then breezily suggested that “The button below makes it easy.”
While I appreciated that the review task would be easy, I didn’t appreciate the fact that the store was asking for a review before the deal was fully complete. I replied to the email — not through the suggested “easy” button — to say as much: “Let’s wait and see how the delivery turns out.” With that, I received another email that said, “Your message wasn’t delivered because the address could not be found.” I guess they only wanted me to do exactly what they wanted me to do — leave a review.
Later that night, I received a text from “Jenny the chatbot.” She confirmed my appointment, which was great. Then she promised that she would let me know once the driver was 60 minutes away from my house. Also great. I was glad to be able to rely on that hour lead-time as I planned my day.
Today, delivery day, the driver showed up without warning (breaking Jenny’s promise) with a dented appliance (making me really glad I didn’t offer a premature review).
After I spoke to a customer service agent, returned the dented appliance, and rescheduled delivery of a non-dented version, Jenny showed up on my phone again telling me that the driver was “expected to make a delivery in the next 30 minutes.” (Now she was just lying, perhaps trying to bend reality.) This text was followed almost immediately by a text that said, “Your order was delivered!” And she then asked for another rating.
Jenny the chatbot is clearly in the wrong line of work. She should be a comedy app that begins when you turn on your phone (bootup comedy). On a less light note, though, I now have to decide if I want to continue to do business with an organization that is so clearly working against the best efforts of its sales staff and the best interests of its customers. I’m sure Jenny doesn’t cost as much as a human system or a really effective automated system. But there are hidden costs when you save money with intelligence that makes a previously personal transaction feel . . . artificial.