
Client
-
Team members
Tan Ler Shan, Danikh Nizam
Year
2025
Tools
p5.js, Figma, GitHub
Design Brief:
This platform explores 2D codes as physical objects, interaction interfaces, and service touchpoints. By examining how materiality affects scannability, how our bodies engage with these markers, and how they mediate trust in digital systems, students learn to both familiarise and defamiliarise these ubiquitous codes. The aim is to creatively reinterpret barcodes and QR codes to propose new interaction paradigms, services, or ways of producing them, culminating in proof-of-concept prototypes that critically reimagine their role in interaction, service, and systems design.


DECODE breaks down the everyday QR code to reveal the hidden logic inside a piece of ubiquitous technology.

There are seven interactions at DECODE, ranging from scanning a code as usual, to revealing hidden components, to assembling physical and digital segments together. The exhibition spans a spectrum of interaction types, including tactile physical engagements, purely digital experiences, and phygital moments where both worlds meet. DECODE offers a wide range of encounters for anyone who has used QR codes for years yet never paused to consider how they actually work.

Across the seven stations, we explored the QR code as an interactive artefact by manipulating its pixel structures, altering key functional components, and prototyping with unconventional materials such as transparent acrylic. By layering new forms of interaction onto the simple act of scanning, we reframed the QR code from a passive data container into an active site of learning, discovery, and algorithmic literacy.


Each station is built around a custom template designed by us. The template consists of a 3D-printed holder that secures the acrylic piece and provides a consistent way for users to handle the QR artefact. We laser engraved different variations of QR codes, including full codes, half codes, and isolated markers, to highlight specific components and manipulations. One station, Contrast, includes an additional sliding component that lets users move the QR code across different backgrounds to observe how contrast influences its readability and scannability.

Each station starts with a locked site that prompts the user to participate in the interaction to unlock the site and continue to the next station.

For this particular station, “Universality,” users engage in a physical–digital interaction where they must use the provided webcam to scan a series of QR codes displayed on screen. The twist is that these on-screen codes are intentionally missing their three position markers. Users are given an acrylic artefact with the markers laser engraved, which they must align with each digital code. This introduces an additional layer of phygital interaction, requiring users to bridge the physical and digital components to complete the scan. Each successful scan triggers a fireworks animation and causes the QR code to disappear. Once all QR codes have been successfully scanned, the final QR code to unlock the site appears.

Scanning the final QR code directs the user to the unlocked site, completing the interaction loop. This closing step reinforces the relationship between user action, system response, and progression, demonstrating how physical augmentation can restore functionality and enable access within a digitally constrained environment.


All seven stations share the same overarching goal: to reach the “successfully scanned” page by observing, experimenting, and uncovering the logic embedded in each task. The unlocked page reveals the underlying terminology and principles behind the specific QR code manipulation used in that station, providing a concise explanation of why the interaction works the way it does.


These interactions across the seven stations each highlight a distinct feature of QR codes, ranging from basic scanning requirements to the structure of their underlying algorithms. The sequence is intentionally curated, beginning with fundamental components, progressing towards more abstract concepts, and concluding with the QR code’s relationship to us as users. Through these simple yet deliberate interactions, each feature is foregrounded and made tangible, allowing you to experience the concept first-hand.


Station 1, “Contrast,” explores how background conditions influence the scannability of a QR code. Users are given a 3D-printed template holder and a transparent acrylic piece with the unlocked site QR code laser engraved onto it. The holder’s sliding component contains two windows, allowing users to move the acrylic QR code across different backgrounds, varying in colour, texture, and material. To progress, users must identify the background that provides sufficient contrast for the scanner to recognise the code. This station highlights a fundamental requirement of QR technology: without clear visual contrast between the code and its surface, the algorithm cannot reliably decode it.
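The contrast requirement the station demonstrates can be sketched as a simple luminance comparison. This is an illustrative model, not the project’s actual detection code: the 0.4 threshold is an assumption chosen for the example, and real scanners binarise the image in more sophisticated ways.

```javascript
// Relative luminance of an [r, g, b] colour (0-255 channels),
// using the standard sRGB channel weightings.
function luminance([r, g, b]) {
  return (0.2126 * r + 0.7152 * g + 0.0722 * b) / 255;
}

// Treat a code as scannable when the dark modules and the background
// differ in luminance by at least the (assumed) threshold.
function hasSufficientContrast(moduleColour, backgroundColour, threshold = 0.4) {
  return Math.abs(luminance(moduleColour) - luminance(backgroundColour)) >= threshold;
}

console.log(hasSufficientContrast([0, 0, 0], [255, 255, 255])); // true: black on white
console.log(hasSufficientContrast([0, 0, 0], [40, 40, 40]));    // false: black on dark grey
```

Sliding the acrylic code from a white window to a dark textured one is, in effect, moving between these two cases: the code itself never changes, only the luminance gap beneath it.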


Station 2, “Error Correction,” introduces a scratchcard-style interface where users erase a digital layer to gradually reveal the QR code beneath. This interaction demonstrates the error correction capability built into QR codes, where up to 30 per cent of the code can be obscured or degraded at the highest error-correction level and still remain readable. As users remove more of the surface, they experience first-hand how much visual information the scanning algorithm can recover, highlighting the robustness of the QR code’s design.
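The station’s pass/fail behaviour can be modelled as a lookup against the four standard error-correction levels. The recoverable fractions below are the published QR figures (L ≈ 7%, M ≈ 15%, Q ≈ 25%, H ≈ 30%); treating the scratch layer as a single obscured fraction is a simplification for illustration.

```javascript
// Approximate fraction of modules each QR error-correction level can recover.
const RECOVERABLE = { L: 0.07, M: 0.15, Q: 0.25, H: 0.30 };

// True if a code at the given level should still decode after
// `obscuredFraction` of its modules are hidden by the scratch layer.
function stillReadable(level, obscuredFraction) {
  return obscuredFraction <= RECOVERABLE[level];
}

console.log(stillReadable("H", 0.25)); // true: within level H's 30% budget
console.log(stillReadable("L", 0.25)); // false: level L tolerates only ~7%
```

In the station itself, the user keeps scratching until the remaining cover drops inside this budget, at which point the scanner suddenly succeeds, making the threshold tangible.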


Station 3, “White Space,” focuses on the quiet zone that surrounds every QR code, which is essential for the scanner to distinguish the code from its background. Users must repeatedly press the spacebar to expand the quiet zone around the code. If they stop pressing or do not press quickly enough, the quiet zone begins to shrink and the QR code becomes unscannable. This interaction highlights how critical the white space is for successful decoding and how easily scannability is compromised when this boundary is insufficient.


Station 4, “Uniqueness,” presents a wall of half QR codes where users must identify the correct match for the half they hold. Using the previously introduced template and a laser engraved acrylic half code, users compare their fragment against the displayed options to locate the exact counterpart. This interaction highlights the uniqueness of each QR code and how even visually similar halves can be incorrect. Users are encouraged to examine structural details, including whether their fragment contains one or two position markers, as these features provide important clues. The station emphasises that every QR code is distinct, and only one precise half will complete the original whole.


Station 5, “Universality,” presents a set of deconstructed QR codes on screen, each intentionally missing all three position markers. Users return to the physical template and use their acrylic piece, which contains only the three laser engraved markers. By aligning these universal markers with each markerless QR code, users complete the structure required for scanning. Unlike Station 4, where the acrylic half code matches only one specific counterpart, the three markers here are universal and will work with any QR code as long as they are correctly aligned. A notable inversion also takes place during this interaction: in typical scanning, it is the smartphone’s screen that responds, but here it is the screen being scanned that reacts. Each successful alignment causes the QR code to disappear with a fireworks effect. Once all codes have been completed, the final QR code appears to lead users to the unlocked site.
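One way to think about the alignment step is as a tolerance check between where the on-screen code expects its three position markers and where the webcam sees the acrylic’s engraved markers. Everything below is a hypothetical sketch: the function names, coordinates, and 12-pixel tolerance are assumptions for illustration, not the station’s actual detection code.

```javascript
// Euclidean distance between two {x, y} points.
function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// All three physical markers must sit within `tolerance` pixels of the
// corner positions the markerless on-screen code expects.
function markersAligned(expected, detected, tolerance = 12) {
  return expected.every((target, i) => distance(target, detected[i]) <= tolerance);
}

const expected = [{ x: 40, y: 40 }, { x: 200, y: 40 }, { x: 40, y: 200 }];
const aligned  = [{ x: 43, y: 38 }, { x: 198, y: 44 }, { x: 41, y: 205 }];
const offGrid  = [{ x: 90, y: 90 }, { x: 198, y: 44 }, { x: 41, y: 205 }];

console.log(markersAligned(expected, aligned)); // true: all three within tolerance
console.log(markersAligned(expected, offGrid)); // false: first marker is far off
```

Because the check cares only about marker positions, not the data modules behind them, the same acrylic piece works on every code in the set, which is the “universality” the station demonstrates.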


Station 6, “Digital Eyes,” is divided into two parts. The first presents a series of QR codes that look visually different to us as humans, sometimes drastically so, yet all encode the same data that leads to the unlocked site. While our eyes focus on surface appearance, the scanner’s digital eye reads only the underlying structure and information, revealing how QR codes can vary dramatically in form while remaining functionally identical. The second part offers an interactive breakdown of how QR codes work, guiding users through concepts such as error correction, Reed–Solomon coding, position markers, data modules, and other mechanisms that underpin the scanning algorithm. Together, these interactions highlight the contrast between human perception and digital interpretation, offering insight into how QR codes are actually read and reconstructed.
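One mechanism behind the first part is masking: QR encoding XORs the data modules with one of eight mask patterns, so the same payload can wear very different “faces”, and the scanner simply XORs the mask away again. The sketch below uses a tiny 4×4 grid rather than a real code; the mask condition is pattern 0 from the specification, applied here purely to show that masking changes the look but not the data.

```javascript
// A toy 4x4 grid of data modules (1 = dark, 0 = light).
const data = [
  [1, 0, 1, 1],
  [0, 1, 0, 0],
  [1, 1, 0, 1],
  [0, 0, 1, 0],
];

// Mask pattern 0 from the QR spec flips modules where (row + col) % 2 === 0.
function applyMask(grid) {
  return grid.map((row, r) =>
    row.map((bit, c) => ((r + c) % 2 === 0 ? bit ^ 1 : bit))
  );
}

const masked = applyMask(data);
const unmasked = applyMask(masked); // XOR masks are self-inverse

console.log(JSON.stringify(masked) !== JSON.stringify(data));   // true: looks different
console.log(JSON.stringify(unmasked) === JSON.stringify(data)); // true: same data recovered
```

To human eyes the masked and unmasked grids are different images; to the scanner they are the same bits, one XOR apart, which is exactly the perception gap the station stages.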


Station 7, “Meaning-Making,” examines how humans rely on contextual cues and call-to-action prompts to interpret what a QR code is meant to do. While scanners read only the encoded data, we depend on surrounding text such as “scan to order” or “scan for menu” to understand the intended action. To highlight this contrast, the station presents a series of posters that deliberately subvert these expectations. A poster may instruct users to scan for a menu, yet the QR code leads elsewhere, prompting users to notice how heavily meaning is constructed through context rather than the code itself. This interaction foregrounds the role of human interpretation in QR use and invites reflection on how much we depend on these cues to navigate everyday digital interactions.





