At the Delft Design for Values Institute we use ‘Design for Values’ as an umbrella term that encompasses a diversity of design approaches, theoretical backgrounds, considered values, and application domains (see our introduction on design for values). Below we list some examples.
Exploring the multi-faceted relationships between design and democracy in a rapidly changing world
Some have argued that technology can be a threat to democracy (e.g. because certain energy technologies require a central system of control); others have pointed out that technology may advance democracy (e.g. information technology supporting the Arab Spring). Both recent developments (such as micro-targeting voters on social media platforms) and future promises (such as the smart city) make design for democracy more relevant than ever. Democracy is also related to various other values, such as justice and transparency. Various researchers at Delft University of Technology work to integrate theories on the value of democracy, and insights into the empirical realities affecting democracy, into the process of design and innovation, in order to advance democracy.
How products and services can help us act in ways that benefit society
Products that support us in fulfilling our needs can also have a significant yet implicit influence on our behavior. In social design we try to understand and account for this influence of design on human behavior from a social perspective, with an eye for both short- and long-term effects. The values at stake differ per project; examples include equality between men and women, healthy living, sustainability, and animal welfare. A framework has been developed that helps designers deconstruct social problems into clear conflicts between personal and collective concerns. Design offers various strategies for dealing with these conflicts effectively.
Putting human values at the core of AI systems that become increasingly autonomous
Developments in autonomy and learning are rapidly enabling AI systems to decide and act without direct human control. Algorithm development has so far been driven by the goal of improving performance, resulting in opaque black boxes. But greater machine autonomy must come with greater responsibility and the capacity to explain decisions. Trust in AI must be based on transparency, since humans cannot relate to the inner workings of machines. Putting human values at the core of AI systems calls for a mindset shift among researchers and developers, towards the goal of improving transparency rather than performance, which will lead to novel and exciting algorithms. In short: turning Deep Learning into Valuable Learning.
Designing products and services that increase the well-being of individuals and communities
Technical innovations have increasingly become infused with design, but they do not necessarily contribute to the well-being of individuals or society. Positive design takes up our responsibility as design researchers to generate knowledge that enables designers to formulate effective strategies for contributing to people's happiness. It should not only help designers in their attempts to deliberately design for meaningful product-user relationships, but ultimately also to design products that contribute to a healthy society: to make the world a better place. The Delft Institute of Positive Design was established in 2011 to support designers in their efforts to design responsibly.
Creating an economy in which today’s products are tomorrow’s resources
Our global society is not sustainable. We are facing grand challenges: waste, climate change, resource scarcity, loss of biodiversity. At the same time we want to sustain our standard of living and offer opportunities for a growing world population. The concept of the circular economy provides solutions. Circular product design explores and develops design methods and strategies for product lifetime extension, reuse, re-manufacturing and recycling, as well as the business models that enable these strategies. Designing for the circular economy helps businesses to create value and contributes to a sustainable society.
Smart, responsible technologies that help cities become more sustainable
Technological development changes our cities: the way we live, work, plan our living environment, and organize ourselves. In Smart Cities the digital and physical worlds come together. The concept brings together new technologies, networks, and infrastructure to help make cities future-proof and improve quality of life. The role of big data, privacy and sustainability issues, but also the history of urban planning, are all relevant aspects of building smart cities.
Dealing with the ethical and responsibility issues raised by behavioral change strategies used in society
Persuasive Technology is a vibrant interdisciplinary research field, focusing on the design, development and evaluation of interactive technologies aimed at changing users’ attitudes or behavior through persuasion and social influence, but not through coercion or deception. An example of persuasive technology is an intelligent decision support system for negotiation. Persuasive technologies are used to change people’s behavior in various domains such as healthcare, sustainability, education or marketing. It is a way to realize values such as health, sustainability and quality of life.