After wrapping up a conference call in an airline lounge at Heathrow Airport, Jennifer powered down her laptop and joined her family for their flight back to Houston. As Jennifer scanned her boarding pass at the gate, an alert popped up asking that she speak to a flight attendant. It wasn’t a last-minute upgrade, however. Instead, the airline reported that a service rep was on the way to return Jennifer’s lost phone.
“Huh?” Jennifer asked herself while rifling through her bag. Sure enough, the phone had gone missing somewhere between the conference call and the departure gate. But how had the carrier identified Jennifer as the phone’s owner? Her name wasn’t displayed externally, and the locked screen showed only the time and date, superimposed over a photo of her two daughters.
Mystified, Jennifer mentally retraced her steps. She had used the airline app to display her boarding pass for security clearance and to make a duty-free purchase. She also used her rideshare app to prebook her ride home in Houston. She then used her airline app to enter the frequent flyer lounge, where the phone was found. In the end, each of these actions created digital breadcrumbs that, when combined with the lock-screen photo of her daughters, who share her radiant blue eyes and bright red hair, made determining ownership of the device a relatively simple deduction.
As Jennifer boarded the plane, she gratefully tucked her phone into her handbag. Yet she couldn’t dismiss one nagging concern: Was the airline’s ability to track her morning at Heathrow a time-saving convenience – or an invasion of privacy?
It’s a question that today’s commercial real estate (CRE) companies will need to consider as they continue to embrace the idea of smart, high-performance buildings. Business leaders will need to carefully balance tapping into intimate information about how we live, work, and play with ensuring that personal privacy is protected.
Buildings with smart technology use an ever-expanding constellation of sources that includes connected sensors, biometric devices, social media feeds, beacons, facial recognition, and, of course, ubiquitous smartphones. The goal is operational efficiency, financial optimization, energy savings, and tenant retention. In addition, as retail market pressures and coworking facilities shift the dynamics of the traditional CRE landscape, it has become increasingly important to create an ecosystem of experiences and conveniences deduced from the tendencies and preferences of occupants.
Collecting operational data from IT systems and building equipment is expected. And insights gleaned from foot traffic and consumption of resources rarely intrude on personal privacy. It’s when building operators and owners start to use personal information about the activities of people within their buildings – such as job titles, social media screen names, calendars, favorite foods, and email and IM communications – that using these digital breadcrumbs to deliver personalized experiences can get tricky.
For instance, sensors can be used to identify a person and automatically set the room temperature to the person’s preference, connect the computer to the LCD display, and alert others to her location. This level of tracking is accepted by most as a worthwhile personalized experience. But many people would draw the line if that same level of surveillance followed them into the building’s bathroom.
Artificial intelligence (AI), in particular, has raised concerns. A lack of transparency into the design of AI algorithms can allow the technology to unintentionally discriminate against certain groups, such as ethnic and sexual minorities. Great care must be taken to ensure that the technology does not result in violations of privacy and civil liberties.
That potential is compounded when AI is coupled with biometric technologies like facial recognition to identify – and sometimes exclude – certain individuals. For instance, some retail stores use facial recognition to match images captured from surveillance cameras with photos of known shoplifters. In New York, a Brooklyn landlord’s plan to deploy facial-recognition technology in its residential buildings was met with tenant resistance after residents cited potential violations of civil liberties.
To avoid this type of pitfall, the data that businesses use to “train” AI systems should adhere to ethical principles and protect human rights. Doing so will require that the people who design the systems understand the potential ethical risks and write algorithms that are not biased in regard to age, gender, race, sexual orientation, and other characteristics. Beyond this, businesses must be transparent about how data is used and obtain consent for the use of personal information. Also necessary: effective cybersecurity and data-privacy safeguards to help ensure that sensitive data doesn’t end up for sale on the dark web.
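One common way to check training data for the kind of bias described above is to compare positive-outcome rates across demographic groups. The sketch below is a minimal, hypothetical Python example – the record fields and the 0.8 “four-fifths” threshold are illustrative assumptions, not requirements drawn from any specific regulation:

```python
from collections import defaultdict

def selection_rates(records, group_key, outcome_key):
    """Compute the positive-outcome rate for each demographic group
    in a list of training records (outcome assumed to be 0 or 1)."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for record in records:
        group = record[group_key]
        totals[group] += 1
        positives[group] += record[outcome_key]
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest to the highest selection rate.
    Values below 0.8 are a common informal red flag for adverse
    impact (the 'four-fifths rule' used in some fairness audits)."""
    lowest, highest = min(rates.values()), max(rates.values())
    return lowest / highest if highest else 1.0
```

An audit like this would run before training: if the ratio falls below the chosen threshold, the dataset is rebalanced or the feature is reviewed before any model learns from it.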
All of this will require a culture founded on transparency, data privacy, and a commitment to human rights. When individuals enter a building that uses technology to track behavior, businesses should send a consent notice to each individual’s smartphone before any data is collected. For now, this will be a self-determined effort, since there is no global standard regarding the use of AI and data.
Data can help owners and operators of office buildings foster a culture of community and connectedness, which can encourage collaboration and boost productivity and worker satisfaction. Similarly, data collected in shopping and recreation spaces can help customers navigate facilities, enable product recommendations and contextual coupons, and even draw on smartphone search histories to determine whether customers shopped around during their visit.
Whether for work, shopping, or play, this type of personal information can help organizations deepen digital intimacy and create a more connected human experience. Businesses will need to gather data and personalize the experience in ways that foster trust and transparency, and build relationships that can ultimately result in better tenant retention and user experience.
Governments around the globe are responding to the rush of data collection with tighter data-privacy regulations. Most notable is the EU’s General Data Protection Regulation (GDPR), a sweeping data-privacy law that went into effect in 2018 and is designed to protect the personal data of EU citizens by giving users more control over how their information is used.
The GDPR is applicable to any organization that collects, stores, processes, transmits, or uses personal data of EU residents. The regulation stipulates that tenants and landlords must inform occupants about what data is collected and provide clearly worded consent agreements for use of personal data.
Lacking a federal law on data privacy, some U.S. states are introducing their own regulations. California recently enacted the California Consumer Privacy Act to safeguard its citizens’ personal data. The law, which went into effect Jan. 1, 2020, requires that organizations fully disclose the collection and use of data. On the East Coast, the New York Department of Financial Services has implemented 23 NYCRR 500, a regulation that imposes strict new requirements for protection of client information.
Many real estate players are compiling huge volumes of data without a clear idea of how they will use it. Those that do plan an analytics initiative often focus on what they can do with data, not what they should do.
Organizations should adopt a humanistic approach to data that balances individual privacy with corporate profitability. The commitment to data privacy should be embedded in the organizational DNA and instilled as a shared ethical value, much like businesses have embraced sustainability as a pillar of corporate responsibility.
In other words, privacy is everybody’s business. What’s needed is an inclusive Privacy by Design approach in which privacy is embedded into the fabric of IT infrastructure and business processes, and overseen by a chief data officer who is accountable to the C-suite and board.
The first step will be to design and implement an up-to-date data-governance program. Organizations will need to identify and catalog the data they collect, store, transmit, process, and use. Data should be classified based on its age and current state, and tagged to make information easier to pinpoint for analytics.
More than that, an effective approach to data governance will require that organizations adhere to new precepts like data minimization, which curbs the potential for privacy violations by limiting the collection of personal data. It’s not a check-the-box mindset, nor is it purely a technical approach. What’s needed is a holistic respect for data privacy that factors in the human context and is expressly framed in terms of personal rights.
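Data minimization, as described above, can be enforced in code by allowing each processing purpose to see only the fields it actually needs. The sketch below is a minimal illustration; the purposes and field names are hypothetical:

```python
# Allow-list of fields each purpose genuinely requires (hypothetical schema).
# Note that space analytics gets no identity fields at all.
ALLOWED_FIELDS = {
    "climate_control": {"occupant_id", "temp_preference"},
    "space_analytics": {"zone", "timestamp"},
}

def minimize(record, purpose):
    """Drop every field not required for the stated purpose.
    An unknown purpose receives nothing, failing closed."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}
```

Routing every collection pipeline through a filter like this turns minimization from a policy statement into a default behavior: extra personal data never reaches the analytics layer in the first place.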
CRE firms that adopt a humanistic mindset toward data will be better prepared to build a customer-centric business based on transparency and trust. So when a person realizes her digital breadcrumbs are being tracked for beneficial reasons, as Jennifer learned on her trip home, there’s no creepiness factor. Instead, individuals feel good about technology-driven personalization.