I’ve created a clear and user-friendly guide featuring essential design laws and principles for you to learn and apply in your work. As the field of design evolves, this codex will grow with it, regularly updated with new insights and improvements to remain relevant and practical.
Current number of laws and principles: 19
Users' expectations are shaped by their experience with other sites.
The average person can keep about 7 (± 2) items in their working memory.
The time to make a decision increases with the number and complexity of choices.
People judge an experience by its peak and its end.
Objects that are near each other tend to be grouped together.
Colors are represented differently across mediums.
The time to acquire a target is a function of the distance to and size of the target.
The design of an object should suggest how it is to be used.
There is a certain amount of complexity in a system which cannot be reduced.
Aesthetic products are perceived as easier to use.
Productivity soars when a computer and its users interact at a pace (< 400 ms) that ensures that neither has to wait on the other.
People hesitate to quit a strategy due to past investments, even if future benefits don't justify ongoing commitment.
Roughly 80% of the effects come from 20% of the causes.
The tendency to approach a goal increases with proximity to the goal.
Work expands to fill the time available for its completion.
People are more likely to complete a goal if they have a head start.
Users weigh the cost of their action (effort) against the benefits they will receive.
Items in short supply are perceived as more valuable.
Among competing hypotheses that predict equally well, the one with the fewest assumptions should be selected.
Users' expectations are shaped by their experience with other sites.
Jakob's Law states that users' expectations for a website are shaped by the cumulative experience they have gained on other websites. When designing a site, remember that people prefer interfaces that work the way the ones they already know do: familiar patterns feel easy to use because they match habits formed elsewhere.
Jakob Nielsen, a leading expert in web usability, first described this principle. His extensive research in user experience (UX) design underpins this law. It stresses the importance of designing websites that align with what users commonly experience online. Nielsen's work delves into how users' past online interactions shape their current expectations, suggesting that designers should consider these experiences to create more intuitive and efficient websites or applications.
The average person can keep about 7 (± 2) items in their working memory.
Miller's Law indicates that the average human mind is capable of holding around 7 (plus or minus 2) distinct items in short-term memory. This concept is crucial for understanding the limitations of human cognition. It suggests that when information exceeds this 7-item threshold, it becomes harder for people to process and retain. This has significant implications for how information should be presented, particularly in fields like design and education.
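The 7±2 limit is why long strings of digits are usually chunked before being shown to users, as in phone numbers or credit card numbers. A minimal Python sketch (the `chunk` helper and the sample number are hypothetical):

```python
def chunk(digits: str, size: int = 3) -> list[str]:
    """Split a long digit string into small groups (illustrative helper)."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

raw = "4915123456789"        # 13 separate items: well beyond the 7±2 span
chunks = chunk(raw, 3)       # 5 groups: comfortably within working memory
assert len(chunks) <= 7 + 2
```

The same number of digits is presented, but grouping reduces the count of items the reader must hold at once.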
George A. Miller, in his landmark 1956 paper "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information," introduced this principle. This paper is one of the most cited in psychology, greatly influencing fields like design, communication, and education. Miller's work was based on earlier research and observations, and he synthesized these findings to propose the 7±2 rule. His analysis of the processing limits of the human brain has had lasting implications on how information is structured and presented, from educational curriculum to user interface design. His work also spurred further research into human cognitive processes, leading to a deeper understanding of memory, attention, and information processing.
The time to make a decision increases with the number and complexity of choices.
Hick's Law states a simple but significant relationship: as the number and complexity of choices increase, so does the time required to make a decision. This principle highlights how an increase in options can complicate decision-making, leading to potential decision fatigue.
T = b log2(n + 1)
where T represents the decision time, n is the number of choices, and b is a constant that varies based on the task's nature.
For instance, in a scenario with 4 options, the decision time can be estimated once b has been measured for the task at hand; b varies across contexts, reflecting how the nature and complexity of the choices influence decision time.
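As a rough illustration, the formula can be evaluated directly. The constant b below (0.2 seconds per bit) is an assumed value for the sketch, not an empirically measured one:

```python
import math

def hick_decision_time(n_choices: int, b: float = 0.2) -> float:
    """Estimated decision time T = b * log2(n + 1); b is task-dependent."""
    return b * math.log2(n_choices + 1)

t4 = hick_decision_time(4)   # 0.2 * log2(5)  ~ 0.46 s
t8 = hick_decision_time(8)   # more choices -> longer decision time
```

Note the logarithmic growth: doubling the number of options adds a roughly constant increment to decision time rather than doubling it.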
Hick's Law was developed in the 1950s by psychologist William Edmund Hick. It's rooted in understanding human thought processes. Influenced by Claude Shannon's information theory, Hick's work, along with Ray Hyman's, has been crucial in psychology, human-computer interaction, and design. Their research highlights the effect of choice quantity on decision time. It underscores the significance of having fewer, well-chosen options in different areas, from everyday choices to complex system designs. Hick's Law has evolved to become a fundamental concept in understanding how people interact with their environments and make decisions, reflecting the interplay between human cognition and the structured world around us.
People judge an experience by its peak and its end.
The Peak-End Rule states that people evaluate an experience based on its most intense moment (the “peak”) and how it concludes (the “end”), rather than the average of the experience as a whole. This principle is critical for designing memorable and satisfying interactions, as users will disproportionately remember the high points and the final moments.
The Peak-End Rule was proposed by psychologist Daniel Kahneman, a pioneer in behavioral economics and winner of the Nobel Prize. His research, conducted alongside collaborators like Barbara Fredrickson, examined how people recall experiences and demonstrated that memory is disproportionately influenced by the most intense moments and the ending. This principle has become a cornerstone in the fields of user experience design, customer service, and event planning, highlighting the importance of creating strong peaks and satisfying conclusions to leave lasting positive impressions.
Objects that are near each other tend to be grouped together.
The Law of Proximity, a principle from Gestalt psychology, states that objects placed close together are perceived as a group. This matters for how we visually organize information, both in the physical world and in design: elements near one another appear to belong together, which helps viewers parse and structure what they see. The law affects everything from reading to interpreting complex images, showing that the relative placement of elements changes how they are perceived.
The Law of Proximity emerged from Gestalt psychology in the early twentieth century, as part of a broader theory of how people perceive shapes and patterns. Gestalt psychology marked a major shift in the study of perception and influenced many fields, including design. The Law of Proximity in particular demonstrated how strongly the relative placement of elements shapes what we see.
Colors are represented differently across mediums.
Hexadecimal color codes, often referred to as HEX codes, are integral to web design and digital graphics. They employ a six-character format, or sometimes a shorter three-character version, combining values from the red, green, and blue (RGB) spectrum to create a wide array of colors. Each pair of characters in a HEX code corresponds to one of these primary colors, with possible values ranging from 00 to FF in hexadecimal, or 0 to 255 in decimal. For example, the HEX code #000000 sets all three components to zero, producing black, while #FFFFFF sets all three to their maximum, producing white. This coding system is valued for its accuracy and its consistency in specifying colors across various digital mediums.
The practical application of HEX codes is evident in their widespread use. They allow designers and developers to achieve precise color representation, ensuring that the colors displayed on one device are consistent with those on another. This consistency is vital in maintaining brand identity and visual coherence in digital content. Furthermore, HEX codes are simple to use and universally understood in the digital realm, making them a standard practice in web design. Their versatility and precision make HEX codes an indispensable tool in the digital artist's palette, ensuring that the envisioned design is accurately realized in the digital world.
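The mapping between HEX codes and RGB triples can be sketched in a few lines of Python (function names here are illustrative):

```python
def rgb_to_hex(r: int, g: int, b: int) -> str:
    """Format an RGB triple (0-255 per channel) as a six-character HEX code."""
    for channel in (r, g, b):
        if not 0 <= channel <= 255:
            raise ValueError("each channel must be in 0..255")
    return f"#{r:02X}{g:02X}{b:02X}"

def hex_to_rgb(code: str) -> tuple[int, int, int]:
    """Parse a six-character HEX code back into an RGB triple."""
    code = code.lstrip("#")
    return tuple(int(code[i:i + 2], 16) for i in (0, 2, 4))

assert rgb_to_hex(255, 255, 255) == "#FFFFFF"   # white
assert hex_to_rgb("#000000") == (0, 0, 0)       # black
```

Each two-character pair is simply the same 0–255 channel value written in base 16, which is why the two notations are interchangeable.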
RGB, standing for Red, Green, and Blue, is a foundational color model extensively utilized in electronic displays such as computer monitors, televisions, and smartphone screens. This model is based on the additive color mixing principle, where various intensities of these primary colors are combined to generate a broad spectrum of visible colors. For instance, within the RGB color space, the code RGB(255, 0, 0) signifies pure red. Here, '255' indicates the maximum intensity of the red component, while '0' is used for both the green and blue components. The versatility of this model allows for the creation of a myriad of colors by adjusting the intensity levels of each primary color, enabling it to reproduce virtually any color that the human eye can perceive.
The RGB model is specifically tailored to align with human visual perception, which discerns color through light. Its effectiveness in mimicking the way we see colors makes it an optimal choice for any device that emits light. By leveraging the RGB model, electronic displays can reproduce a wide range of colors, ensuring that the colors seen on screen are vibrant and closely match real-world colors. This adaptability and accuracy in color representation are what make the RGB color model indispensable in the realm of digital imaging and display technology. It not only enhances the user experience by providing true-to-life colors but also forms the basis of digital color interaction and design.
CMYK, standing for Cyan, Magenta, Yellow, and Key (black), is a color model that is primarily used in the printing industry. This model is fundamentally different from the RGB model as it is based on a subtractive color mixing principle. In the CMYK model, colors are created by partially or completely masking certain colors on a lighter background, typically white. The subtractive primary colors—cyan, magenta, and yellow—are utilized to absorb light, thereby reducing the amount of light that is reflected. The 'K' component, which usually represents black ink, is added to provide depth and detail to the image. For instance, the combination CMYK(0, 100, 100, 0) produces the color red by using maximum levels of magenta and yellow, while keeping cyan and black at zero. This method of color mixing is particularly effective in print media.
The CMYK model is crucial in the printing industry as it accurately replicates how physical inks blend to produce a wide range of colors. When inks of cyan, magenta, yellow, and black are printed onto paper, they absorb specific wavelengths of light, thereby subtracting colors from white light to create the desired hue. This process is essential for achieving the desired color output in printed materials. The CMYK color model's ability to produce a broad spectrum of colors through subtractive mixing makes it the standard for any kind of color printing. It ensures that the colors in printed materials, such as magazines, brochures, and packaging, are both vibrant and precise, closely matching the original design intent.
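A naive conversion from RGB to CMYK can be sketched as follows. Real print workflows use ICC color profiles to account for ink and paper behavior, so this formula is only an approximation:

```python
def rgb_to_cmyk(r: int, g: int, b: int) -> tuple[float, float, float, float]:
    """Naive RGB -> CMYK conversion (ignores ICC color profiles)."""
    if (r, g, b) == (0, 0, 0):
        return (0.0, 0.0, 0.0, 1.0)          # pure black: only the K plate
    r_, g_, b_ = r / 255, g / 255, b / 255
    k = 1 - max(r_, g_, b_)                  # shared black component
    c = (1 - r_ - k) / (1 - k)
    m = (1 - g_ - k) / (1 - k)
    y = (1 - b_ - k) / (1 - k)
    return (round(c, 2), round(m, 2), round(y, 2), round(k, 2))

# Pure red maps to maximum magenta and yellow, as in CMYK(0, 100, 100, 0):
assert rgb_to_cmyk(255, 0, 0) == (0.0, 1.0, 1.0, 0.0)
```

The subtractive logic is visible in the code: each ink value is the inverse of its light-emitting counterpart, with black factored out into the K channel.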
The Pantone Color System is a globally recognized standardized color reproduction system, widely used in various industries such as printing, manufacturing, and design. It functions as a universal language for color identification and communication, offering a consistent and reliable way to reference colors. Each color in the Pantone system is given a unique number and name, which ensures uniformity and precision in color reproduction across different applications and mediums. For example, 'Pantone 185 C' is a designated identifier for a specific shade of red, known and used consistently worldwide under this name. This precise system of color identification allows for exact replication and communication of colors in various contexts.
Pantone's extensive range of colors, including unique shades like metallics and fluorescents, sets it apart from standard color models like CMYK, which have limitations in reproducing such hues. This expansive palette is essential in areas like brand identity and product design, where specific and unique colors are often used to establish a brand’s visual identity. Pantone’s system plays a crucial role in ensuring color consistency, especially in complex production workflows where different materials and processes can lead to color variations. By providing a standardized color reference, Pantone helps maintain color fidelity throughout the design and production stages, ensuring that the final product matches the original design intent. This reliability and universality make the Pantone Color System an indispensable tool in the world of color management and design.
The time to acquire a target is a function of the distance to and size of the target.
Fitts's Law is a predictive model of human movement, primarily concerning pointing actions, either physically or virtually. It states that the time required to move to a target area is a function of the ratio between the distance to the target and the target's width. In essence, targets that are smaller and further away take longer to acquire than those that are larger and closer. This principle is fundamental in human-computer interaction and greatly influences user interface design.
T = a + b log2(2D / W)
where T is the average time taken to complete the movement, D is the distance from the starting point to the center of the target, W is the width of the target, and a and b are constants that can be determined empirically for a particular setting.
For example, this formula can be used to calculate the time it might take to move a cursor to a button of a certain size on a computer screen, based on the distance of the cursor from the button and the size of the button.
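As a sketch, the formula can be applied to a cursor-and-button scenario. The constants a and b below are assumed values chosen for illustration, since in practice they are determined empirically:

```python
import math

def fitts_time(distance: float, width: float,
               a: float = 0.1, b: float = 0.15) -> float:
    """Movement time T = a + b * log2(2D / W); a and b are empirical constants."""
    return a + b * math.log2(2 * distance / width)

t_small_far = fitts_time(400, 40)   # 40 px button, 400 px away
t_big_near  = fitts_time(100, 80)   # bigger, closer button is faster
```

The ratio 2D/W is the "index of difficulty": halving the distance or doubling the target width reduces it by the same amount, which is why large, nearby targets feel effortless.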
Paul Fitts, in 1954, developed Fitts's Law. Originally, it was about modeling the action of pointing. Today, it's a key concept in ergonomics, human-computer interaction, and design. Fitts's Law helps us understand how design choices affect user efficiency and comfort. It's based on the idea that people can move more quickly to larger and closer targets. This understanding has been applied in various fields, from designing airplane cockpits to creating user-friendly software interfaces. The law has evolved over time, integrating insights from psychology, engineering, and design to remain relevant in our technology-driven world.
The design of an object should suggest how it is to be used.
Affordance Theory highlights the importance of designing objects in a way that their intended use is immediately apparent to users. Coined by psychologist James J. Gibson, this principle emphasizes that the physical and visual properties of an object should communicate its function. In design, it helps ensure usability by leveraging intuitive understanding rather than relying on instructions.
James J. Gibson introduced the concept of affordances in the context of ecological psychology in the late 1970s. He described affordances as the potential actions an environment or object offers to a user. Later, Donald Norman adapted this idea to design, emphasizing its relevance to user-centered design and human-computer interaction (HCI). Norman highlighted that good design incorporates clear affordances to ensure users understand how to interact with objects and systems without confusion.
There is a certain amount of complexity in a system which cannot be reduced.
Tesler’s Law highlights that every system, process, or interface comes with an inherent level of complexity. This complexity cannot be entirely eliminated; it must be managed or shifted. While designers aim to simplify interfaces, some of this complexity will always exist and has to be handled by either the system or the user. The principle emphasizes the balance between simplifying user experience and preserving essential system functionality.
Larry Tesler, a pioneer in human-computer interaction, proposed this law. Tesler’s work at companies like Apple and Xerox PARC was driven by the goal of simplifying technology while respecting its inherent complexities. His law reminds designers and developers that while interfaces can be made more intuitive, the system’s underlying complexity must be thoughtfully allocated, ensuring users are not overwhelmed while maintaining essential functionality.
Aesthetic products are perceived as easier to use.
The Aesthetic-Usability Effect suggests that users often perceive visually appealing designs as more user-friendly, even if the underlying functionality remains unchanged. This principle highlights the psychological impact of aesthetics on usability, emphasizing that well-designed, visually pleasing interfaces enhance user satisfaction and perceived ease of use.
The Aesthetic-Usability Effect was introduced by researchers Masaaki Kurosu and Kaori Kashimura in 1995 through their study on ATM interfaces. They discovered that visually pleasing interfaces were rated as more usable, even when functionality was identical. This principle has since been integral in fields like user experience (UX) design, cognitive psychology, and product design. It underscores the role of emotional responses in shaping perceptions of usability, reinforcing the importance of combining beauty with practicality.
Productivity soars when a computer and its users interact at a pace (< 400 ms) that ensures that neither has to wait on the other.
The Doherty Threshold emphasizes that efficient interaction between users and computers leads to enhanced productivity. When the system responds within 400 milliseconds, users remain engaged and maintain their workflow without experiencing interruptions or frustration. This principle highlights the importance of creating systems where neither party—human or machine—is left idle.
The Doherty Threshold was formulated by Walter J. Doherty and Arvind J. Thadani in the early 1980s. Their research highlighted the psychological effects of system responsiveness on user engagement. They discovered that keeping interactions under the 400ms threshold aligns with human cognitive rhythms, ensuring users remain focused and productive. This principle has since been widely adopted in design, especially in human-computer interaction and web development, to create smoother and more efficient user experiences.
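In practice, the threshold can be treated as a latency budget that each interaction either meets or misses. A minimal Python sketch (the `respond` wrapper is a hypothetical helper, not a standard API):

```python
import time

DOHERTY_THRESHOLD = 0.4  # seconds

def respond(handler, *args):
    """Run a handler and report whether it met the 400 ms budget."""
    start = time.perf_counter()
    result = handler(*args)
    elapsed = time.perf_counter() - start
    return result, elapsed <= DOHERTY_THRESHOLD

result, fast_enough = respond(sum, [1, 2, 3])
```

A real application would use such a check to decide, for example, whether to show a progress indicator for operations that blow the budget.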
People hesitate to quit a strategy due to past investments, even if future benefits don't justify ongoing commitment.
The Sunk Cost Fallacy is a cognitive bias that leads people to continue with an investment or project because of the substantial resources they've already committed, regardless of future costs or benefits. This fallacy overlooks the principle that past costs are irrelevant to current decisions; only future costs and benefits should be considered. It's a widespread phenomenon affecting personal decisions, business strategies, and even governmental policies, highlighting the psychological difficulty of "cutting losses."
The term "sunk cost" comes from economics and refers to costs that have already been incurred and cannot be recovered. The fallacy aspect, recognizing the irrationality of letting sunk costs influence current decisions, has been discussed in behavioral economics and psychology. It's closely linked with concepts like loss aversion and commitment, studied by psychologists and economists like Daniel Kahneman and Amos Tversky, who explored how irrational biases affect economic and personal decisions.
Roughly 80% of the effects come from 20% of the causes.
The Pareto Principle, also known as the 80/20 Rule, highlights the uneven distribution of effects compared to their causes. In design and productivity, this principle suggests that a small fraction of inputs, features, or actions typically generate the majority of results or value. Recognizing this can help prioritize efforts and resources, ensuring focus on the elements that drive the most impact. It is widely applicable in business, design, and everyday problem-solving.
The Pareto Principle is named after Italian economist Vilfredo Pareto, who, in 1896, observed that 80% of Italy’s land was owned by 20% of the population. The principle was later generalized and applied to a variety of domains by Joseph M. Juran in the mid-20th century. Today, it is a cornerstone of efficiency-focused methodologies such as Lean, Agile, and Six Sigma, helping designers, managers, and strategists make impactful decisions with limited resources.
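The distribution can be checked numerically for any set of measured effects. A small sketch with hypothetical per-feature revenue figures (the numbers are invented for illustration):

```python
def pareto_share(values: list[float], cause_fraction: float = 0.2) -> float:
    """Fraction of the total effect produced by the top `cause_fraction` of causes."""
    ordered = sorted(values, reverse=True)
    top_n = max(1, round(len(ordered) * cause_fraction))
    return sum(ordered[:top_n]) / sum(ordered)

# Hypothetical revenue per feature; two of the ten features dominate:
revenue = [400, 380, 40, 35, 30, 30, 30, 20, 20, 15]
share = pareto_share(revenue)   # the top 20% of features drive ~78% of revenue
```

Real data rarely lands on exactly 80/20; the point is the skew, and a quick computation like this shows where to focus effort.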
The tendency to approach a goal increases with proximity to the goal.
The Goal Gradient Effect describes a psychological phenomenon where motivation to complete a task intensifies as one gets closer to finishing it. People are more likely to put in greater effort and focus as they near the endpoint of their goal. This concept is rooted in behavioral psychology and has significant applications in areas like marketing, gamification, and user experience design.
The effect suggests that perceived progress boosts engagement and perseverance, making individuals more likely to push through challenges as they see the finish line approaching.
Effort = f(1/Distance to Goal)
Where effort increases non-linearly as the remaining distance decreases, often peaking just before the goal is achieved.
The Goal Gradient Effect is often visualized through effort curves, showing that effort or motivation intensifies as distance to the goal decreases.
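The effort curve can be sketched with a toy function; the inverse-distance shape and the clamp value below are illustrative choices, not an empirical model:

```python
def goal_gradient_effort(progress: float) -> float:
    """Toy effort curve: effort grows as the remaining distance shrinks."""
    remaining = max(1.0 - progress, 0.05)   # clamp to avoid division by zero
    return 1.0 / remaining

early = goal_gradient_effort(0.1)   # far from the goal: low effort
late = goal_gradient_effort(0.9)    # near the goal: effort spikes
assert late > early
```

The non-linearity is the key property: the jump in effort between 90% and 99% complete is far larger than between 10% and 19%.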
The concept was first introduced by behaviorist Clark L. Hull in 1932. Hull’s studies on rats in mazes demonstrated that rats ran faster as they neared the food at the end of the maze. This foundational work has since been adapted and applied to human behavior, particularly in the realms of motivation, consumer psychology, and productivity.
The Goal Gradient Effect underscores the importance of designing systems that visually or psychologically emphasize progress, leveraging our innate tendency to focus and work harder as the finish line approaches.
Work expands to fill the time available for its completion.
Parkinson’s Law is an observation about human behavior, highlighting the tendency for tasks to stretch out and consume all the time allocated to them, regardless of their actual complexity or requirements. This principle suggests that the more time you give yourself to complete a task, the longer you’ll likely take—even if the task could have been done in a fraction of that time. Parkinson’s Law has implications for productivity, time management, and efficiency in both personal and professional contexts.
Parkinson’s Law was first articulated by Cyril Northcote Parkinson, a British naval historian and author, in a 1955 essay published in The Economist. The essay humorously described the inefficiencies of bureaucratic systems, noting how administrative tasks and staff levels grow without regard to actual workload. Over time, the concept has been broadly applied to time management and productivity studies, illustrating the universal tendency to let tasks consume available resources. Despite its satirical origins, Parkinson’s Law has become a widely recognized principle with practical implications for optimizing work and managing time effectively.
People are more likely to complete a goal if they have a head start.
The Endowed Progress Effect is a psychological principle describing how individuals are more motivated to complete tasks when they perceive themselves as already having made progress toward the goal. This effect works by creating a sense of momentum or investment in the goal. By providing a head start, the goal feels more attainable, and people are more likely to commit to completing it. This principle is commonly applied in behavioral economics, marketing strategies, and user experience design to encourage task completion or participation.
The Endowed Progress Effect operates on the principle of goal gradient theory, which suggests that individuals increase their efforts as they perceive themselves to be closer to completing a goal. By giving a head start, designers can exploit this natural tendency to boost motivation and engagement.
The Endowed Progress Effect was formalized in a study by Nunes and Drèze in 2006. In their research, they demonstrated that providing participants with a small head start increased completion rates in goal-oriented activities. This concept has since been integrated into various domains, including user interface design, gamification, and loyalty program structures. By leveraging our innate desire for completion and progress, this effect underscores how subtle design elements can have a profound impact on user behavior.
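The head-start mechanism can be illustrated with the loyalty-card design from the Nunes and Drèze study: a 10-stamp card with 2 stamps pre-filled requires the same real work as an 8-stamp card, yet it is perceived as already 20% complete. A minimal sketch (`perceived_progress` is a hypothetical helper):

```python
def perceived_progress(earned: int, endowed: int, total: int) -> float:
    """Perceived completion when `endowed` units are granted up front."""
    return (earned + endowed) / total

# Same real work (8 stamps) in both cases:
plain_card = perceived_progress(0, 0, 8)      # starts at 0% complete
endowed_card = perceived_progress(0, 2, 10)   # starts at 20% complete
```

Reframing the goal so that users begin partway along it is the entire trick; nothing about the required effort changes.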
Users weigh the cost of their action (effort) against the benefits they will receive.
The Cost-Benefit Principle explains user behavior by highlighting the balance between effort and reward. It suggests that users will be motivated to perform an action only when the perceived benefits outweigh the costs involved, whether in terms of time, physical effort, mental strain, or emotional investment. In design, this principle underlines the importance of minimizing friction (cost) and maximizing value (benefit) to encourage desired user actions.
The Cost-Benefit Principle originates from economic theory but has been widely adopted in behavioral science, psychology, and design. Its roots lie in the Rational Choice Theory, which assumes individuals act in their own best interest by evaluating the trade-offs of their decisions. In the design context, this principle has been adapted to understand and influence user behavior, promoting efficiency and satisfaction. By balancing effort and reward, the principle helps designers create experiences that users find engaging, intuitive, and valuable.
Items in short supply are perceived as more valuable.
The Scarcity Principle is a psychological phenomenon that explains how perceived scarcity increases the value or desirability of an item. When resources, products, or opportunities are limited, they seem more appealing to people, who interpret their rarity as a signal of high value or exclusivity. This principle often influences decision-making, as individuals are motivated to acquire scarce items before they become unavailable. It is widely used in marketing, behavioral economics, and user experience design to drive engagement and purchases.
The Scarcity Principle is rooted in loss aversion and reactance theory: people fear losing access to an option more than they value gaining an equivalent one, and limits on availability trigger a desire to restore the threatened freedom of choice.
The Scarcity Principle is deeply tied to principles of behavioral psychology and has been studied extensively in economics and marketing. It gained prominence through the work of Robert Cialdini, whose 1984 book Influence: The Psychology of Persuasion explored scarcity as one of six key principles of persuasion. The principle has since become foundational in various fields, including advertising, e-commerce, and UX design, shaping how products and services are marketed to capitalize on human behavior.
Among competing hypotheses that predict equally well, the one with the fewest assumptions should be selected.
Occam's Razor is a principle in logic and problem-solving that favors simplicity in explanations. It holds that when multiple hypotheses predict equally well, the one with the fewest assumptions is preferable. The concept is central to scientific modeling and philosophy, where it encourages explanations that are as simple as the evidence allows.
The principle is linked to the 14th-century thinker and Franciscan friar William of Ockham, though it existed before him. Versions of this idea can be found in the works of Aristotle and other ancient philosophers. William of Ockham didn't create this idea but used it often and noticeably, which is why it's named after him. As time went on, this principle became an essential part of modern scientific methods, focusing on the idea that hypotheses should be kept as simple as possible.