Exploring the possible impacts of housing tech
This is the second blog in a series following the CaCHE workshop, ‘Towards a fully automated housing system in 2030?’, which took place in London on 9 December 2019. Dr Alison Wallace reflects on the discussion and gives a summary of her presentation ‘Automated Access? Algorithmic risk-profiling tools as housing market intermediaries’.
What will a fully automated housing market look like? How will advances in housing tech change the housing system over the next decade? And how can we ensure that housing tech challenges, rather than exacerbates, existing housing inequalities?
These were some of the themes explored in an engaging afternoon of discussion between practitioners and academics, starting conversations about the potential and the pitfalls of digitally transformed housing futures. The relationship between technology and housing is moving in from the periphery and extending beyond discussions of the built form. It is a subject that housing scholars will have to increasingly engage with to understand the social and economic ramifications of data and tech in global and local housing systems.
Desiree Fields’ overview of the range of technological innovations apparent in the housing system provided an overarching typological framework – digital labour, financialisation, platform logic and surveillance – with which to consider the diverse tech innovations discussed, reflecting some of her and Dallas Rogers’ Housing, Theory and Society paper and Housing Journal podcast (Episode 6).
Within all the presentations, tech and algorithms were primarily used to manage risk: of investment, of renting, of letting or of lending. Whilst innovative start-ups are harnessing technological advances to rebalance the risks for tenants – by identifying bad landlords or enhancing housing activism – most applications support those with existing power in the market.
A variety of proptech/fintech applications were examined in Desiree Fields’ and Thomas Wainwright’s early findings from a comparative study of proptech innovation in Australia, Germany and the UK, with institutional and local variations. Joe Shaw informed us about a real estate data analytics platform providing seemingly objective analysis of neighbourhood investment potential, cutting through market emotions and sentiment, but one based on subjective assumptions, lending a veneer of objectivity and privileging metrics packaged for third-party consumption. Ben Yarrow from Marks Out of Tenancy noted the growth of data surveillance across a spectrum running from the intrusive (China’s social credit system) to the mundanely acceptable to many (Alexa or Siri), challenging us to consider how far we will accept data monitoring, scoring and classification, and for what purposes. Alastair Parvin presented an automated system for producing housing that splits land and property value to make housing affordable in perpetuity, assisted by shared documentation and plans and the mysterious (to me) blockchain technology.
I was delighted to talk about how automated risk-profiling mediates housing systems. Technologies first employed in credit scoring in the 1990s, which underpinned the expansion of credit into the sub-prime, risk-based pricing mortgage market, have now extended to rental markets, where less comprehensive but stringent landlord and agent checks are used to overcome information asymmetries in the rental transaction, ultimately protecting landlords’ investments. The rhetorical promise of data technologies to ensure business speed and efficiency, not least in public services subjected to austerity, has seen start-ups and data firms pushing data frontiers into new domains and critical services such as policing and child protection. Housing is also subject to these new promises. For many private landlords, letting risks have increased with welfare reform but are hard to pool across often limited property portfolios, hence the perhaps understandable impulse to know more about prospective tenant risk. In contrast, some social landlords adopt similar technologies or approaches, ostensibly to prevent tenancies failing but also perhaps to prevent loan covenants being broken and to secure a reliable rental stream, challenging the founding social purposes of housing providers.
The detail and impact of UK credit scoring has attracted limited scholarly attention, unlike in the US, where data is available to hold lenders to account owing to the history of racial exclusion from financial markets. We therefore know little about any disparate impacts of mortgage market profiling, yet the explosion of data possibilities and the computing power to manage a host of new data resources mean that credit scoring technologies are evolving beyond traditional credit bureau and adverse debt data. Some people – younger cohorts and new arrivals to the country – are invisible to the financial services industry because they have no credit history. Existing Credit Reference Agencies (CRAs) and new start-up firms are adopting open banking technologies that allow access to every transaction in a person’s bank account to assess affordability. Some want access to UCAS data (A level results, type of university, dad’s job etc.), utility data, social media and internet browser data, which can be more powerful in predicting the risk of default than financial data alone.
And yet we know little about the balance of benefit and harm embedded in these automated and augmented forms of decision-making that permit or inhibit housing access. Moreover, the evidence base lags technological innovation.
Numerous issues have been raised about algorithms reinforcing bias rather than limiting it, as the representativeness of the data is unclear, and the coding, weighting and testing of variables are hidden from public gaze. Commercial confidentiality can limit what we know about decision making, rendering some decisions unknowable and restricting individual or collective rights. How professionals engage with these tools is also uncertain: some staff may be resistant or ambivalent to their use, while others may privilege data over professional judgment. We cannot rely on technological solutions alone to ensure fairness but must examine the whole socio-technical process, looking at how the tools are constructed and deployed, and with what impacts. And how much do people looking for housing know about these systems? Young people in Manchester have already highlighted their concern about being pushed into the poor end of the private rented sector due to poor credit files. Much energy is going into developing responsible, explainable and ethical technology, so the potential can be shared. But for the moment we know little about the construction, deployment or impacts of automated risk-profiling tools in the housing system: who passes, who fails, and with what consequences? The potential to further stratify the housing system means these technologies warrant further scrutiny.
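To make concrete why the coding, weighting and thresholds of such tools matter, here is a purely hypothetical sketch of a tenant-risk score. The variable names, weights and threshold are invented for illustration and are not drawn from any real credit scoring system; real systems keep precisely these choices commercially confidential.

```python
import math

# Hypothetical coefficients for a toy logistic risk score.
# In a real tool, which variables appear and how they are weighted
# are exactly the hidden design choices discussed above.
WEIGHTS = {
    "months_of_credit_history": -0.02,  # longer history lowers predicted risk
    "prior_missed_payments": 0.8,       # adverse debt data raises it
    "rent_to_income_ratio": 2.5,        # affordability, e.g. via open banking
}
BIAS = -1.0

def risk_score(applicant: dict) -> float:
    """Return a score in (0, 1): the model's predicted probability of default."""
    z = BIAS + sum(w * applicant.get(k, 0.0) for k, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def decide(applicant: dict, threshold: float = 0.5) -> str:
    # The pass/fail threshold is itself a policy choice hidden from
    # applicants: moving it shifts who passes and who fails.
    return "refer" if risk_score(applicant) >= threshold else "accept"

# A new arrival with no credit history and no missed payments is scored
# on thin data alone and, under these invented weights, gets referred.
newcomer = {"months_of_credit_history": 0, "prior_missed_payments": 0,
            "rent_to_income_ratio": 0.45}
print(decide(newcomer))  # prints "refer"
```

Even this toy version shows the point made above: an applicant with no adverse history at all can fail purely because of how the variables, weights and cut-off were chosen, and none of those choices are visible to the person being scored.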
Views expressed by the authors may not represent the views of CaCHE.
Date: January 14, 2020 4:54 pm
Author(s): Alison Wallace