We demand a world in which technology is created to protect and empower those who are impacted by it. Our technology must respect the rights and freedoms of those users. We need to take control in order to collectively build a better world in which technology works in service to the good of humankind, protecting our rights and digital autonomy as individuals.
We have become more reliant than ever on the technology that we intertwine into every aspect of our lives. Technology is currently made not for us, the people using it, but for those who intend to monetize its use and own the associated intellectual property. Services are run via networked software on computers we never directly interact with. Our devices are designed to function only while broadcasting our intimate information, regardless of whether transmitting that information is necessary for them to work. We generate data that we do not have access to, data that is bought, sold, and traded between corporations and governments. Technologies we're increasingly being forced to use reinforce and amplify social inequalities. As schools and jobs go online, high-speed computing, centralized services, and Internet access become inescapably necessary. Technology is designed and implemented to oppress, often with sexist, classist, and racist implications. Rather than being served by these tools, we are instead in service to them. These gatekeepers of our technology are not individual people or public organizations who think about the well-being of others, but corporations, governments, and others with agendas that do not include our best interests. Our technology has become the basic infrastructure on which our society functions, and yet the individuals who use it have no say or control over how it works.
It's time to change our digital destiny.
We therefore call for the adoption of the following principles for ethical technology:
From conception to public availability, technology must be in the service of the people and communities who use it and are impacted by it. This includes freedom from surveillance, data gathering, data sales, and vendor and file-format lock-in. When it becomes apparent that the technology, as delivered, does not meet the needs of a given person or community, they must be able to change and repair it. Technology must offer an option for use without connecting to a network, except where connectivity is the purpose of the technology.
People must have the ability to study and understand technology in order to decide whether using it, as delivered, is the right choice. People must be able to understand, either through their own work or the work of others, how the technology operates and what information it is collecting, storing, and selling. Development processes should be transparent and accessible. Additionally, there should be no punitive responses for declining consent: practical alternatives must be offered, whether those are changes to the underlying technology or compatible updates from the original provider or from third parties.
Technology needs to be designed for communities as well as for the individuals using it. These communities can be intentionally built around a piece of technology, geographic in nature, or united by another shared purpose. When people discover that their technology is not functioning in their interest, or that the trade-offs of using it have become too burdensome, they must have the ability to change what they are using: to replace the software on a device that they have purchased, to use the technology without connecting to a centralized network, or to choose a different service. The technology should be interoperable with other services and software. Empowering action includes having the ability and the right to organize in order to repair the technology and to migrate essential data to other solutions. Control of essential data must belong to the communities generating the data and relying on it.
Building technology must be done in a way that respects the rights of people, including privacy, open communication, and the safety to develop ideas without fear of monitoring, risk, or retribution. These rights cannot be tacked on as afterthoughts, but must be considered during the entire design and distribution process. Services should plan to store the minimum amount of data necessary to deliver the service in question, not collect data that may lay the groundwork for exploitation down the road. Anonymization and regular deletion of inessential data should be planned from the outset. Devices must be able to run and function while not transmitting data beyond what their functionality requires.
We, as individuals, collectives, cultures, and societies, make this call in the rapidly changing face of technology and its deepening integration into our lives. As our connection to digital networks and to one another changes, our technology must support us, not alienate us. We must forge our own digital destinies. Forming partnerships between technology makers and those using and impacted by those technologies is necessary to build the equitable, hopeful future we dream of.