The new maintenance coordinator at a Dallas apartment complex received praise from tenants and co-workers for his good work and round-the-clock help. Previously, the eight people who managed the complex’s 814 apartments and townhouses were overworked, putting in more hours than they wanted.
The complex’s newest staff member, in Dallas’s Cypress Waters district, is available 24/7 to schedule repair requests and never takes time off.
That’s because the maintenance coordinator is an artificial intelligence chatbot that property manager Jason Busboom began using last year. The bot, which sends text messages under the name Matt, takes requests and manages appointments.
The team also includes Lisa, the leasing bot who answers questions from potential tenants, and Hunter, the bot who reminds people to pay their rent. Mr. Busboom chose the personality he wanted for each AI assistant: Lisa is professional and informative; Matt is friendly and helpful; and Hunter is stern, projecting authority when reminding tenants to pay their rent.
Technology has freed up valuable time for Mr. Busboom’s human staff, he said, and everyone is now much happier in their jobs. Before, “when someone took a vacation, it was very stressful,” he added.
Chatbots, along with other AI tools that can track common space usage and monitor energy consumption, assist with construction management, and perform other tasks, are increasingly common in property management. The money and time saved by new technologies could generate $110 billion or more in value for the real estate industry, according to a 2023 report from the McKinsey Global Institute. But advances in AI and its catapult into the public consciousness have also raised the question of whether tenants should be informed when interacting with an AI robot.
Ray Weng, a software programmer, learned he was dealing with AI rental agents while looking for an apartment in New York last year, when agents for two buildings used the same name and gave the same answers to his questions.
“I prefer to deal with a person,” he said. “It’s a big commitment to sign a lease.”
Some of the apartment tours he did were self-guided, Mr. Weng said, “and if it’s all automated, it feels like they don’t care enough to have a real person talk to me.”
EliseAI, a New York-based software company whose virtual assistants are used by owners of nearly 2.5 million apartments in the United States, some of which are operated by property management company Greystar, is working to make its assistants as human as possible, said Minna Song, EliseAI’s chief executive. In addition to being available via chat, text and email, the bots can interact with tenants by voice and can have different accents.
Virtual assistants that help with maintenance requests can ask follow-up questions, such as checking which sink needs to be fixed in case a tenant isn’t available when the repair is done, Ms. Song said, and some are starting to help tenants troubleshoot maintenance issues on their own. Tenants with a leaking toilet, for example, can receive a message with a video showing them where the water shutoff valve is and how to use it while they wait for a plumber.
The technology is so good at initiating a conversation and asking follow-up questions that tenants often mistake the AI assistant for a human. “People come to the rental office and ask for Elise by name,” Ms. Song said, adding that tenants texted the chatbot to meet for coffee, told managers that Elise deserved a raise and even dropped off gift cards for the chatbot.
Not telling customers that they are interacting with a bot is risky. Duri Long, an assistant professor of communication studies at Northwestern University, said that could cause some people to lose confidence in the company that uses the technology.
Alex John London, a professor of ethics and computer technology at Carnegie Mellon University, said people might view this deception as disrespectful.
“All things considered, it’s best for your robot to announce early on that it’s a computer assistant,” Dr. London said.
Ms. Song said it was up to each company to monitor evolving legal standards and think about what they were saying to consumers. A large majority of states do not have laws requiring disclosure of the use of AI to communicate with a human, and the laws that do exist primarily relate to influencing voting and sales, so a bot used for maintenance scheduling or rent reminders would not need to be disclosed to customers. (The Cypress Waters complex does not tell current and prospective tenants that they are interacting with an AI bot.)
Another risk concerns AI-generated insights. Milena Petrova, an associate professor who teaches real estate and corporate finance at Syracuse University, said humans need to be “involved to be able to critically analyze all the results,” especially for any interactions outside of the simplest and most common.
Sandeep Dave, director of digital and technology at CBRE, a real estate services company, said it didn’t help that the AI “comes across as very confident, so people will tend to believe it.”
Marshal Davis, who runs a property management company and a real estate technology consultancy, monitors the AI system he created to help his two office workers answer the 30 to 50 calls they receive daily at a 160-unit apartment complex in Houston. The chatbot is good at answering simple questions, like those about rent payment procedures or details of available apartments, Mr. Davis said. But on more complex questions, the system can “answer the way it wants and not necessarily the way you want,” he said.
Mr. Davis records most calls, runs them through another AI tool to summarize them, and then listens for the ones that seem problematic — like “when the AI says, ‘The customer expressed frustration,’” he said — to figure out how to improve the system.
Some renters aren’t completely sold. Jillian Pendergast interacted with bots last year while looking for an apartment in San Diego. “They’re great for making appointments,” she said, but dealing with AI assistants rather than humans can get frustrating when they start repeating answers.
“I can see the potential, but I feel like they’re still in the trial and error phase,” Ms. Pendergast said.