150+ Laws of Software & Technology

The field of software and technology has accumulated a rich body of named laws, principles, and adages that capture hard-won lessons about how systems are built, how teams behave, and how technology evolves. From Brooks’s Law to the Pareto Principle, these eponymous “laws” distill decades of engineering experience into memorable one-liners. The 150+ entries collected here offer a reference for developers, managers, and anyone who wants a shared vocabulary for the recurring patterns of software projects, and a shortcut past having to rediscover each lesson the hard way.

Knuth’s Optimization Principle (manage projects wisely)

Premature optimization is the root of all evil.


The Pattis Zen

When debugging, novices insert corrective code, experts remove defective code.


Lubarsky’s Law (Testing, Quality)

There is always one more bug.


The Ninety Ninety Rule (manage projects wisely)

The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time.


Laws of Software Estimates (manage projects wisely)

Estimates are five things: waste, non-transferable, wrong, temporary, and necessary.


Wirth’s Law (predicting future)

Software gets slower more quickly than hardware gets faster.


Zawinski’s Law (predicting future)

Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones that can.


Agile Peculiarity (manage projects wisely)

There’s always time to make more changes until there’s no more time to make changes. It’s always the last change that blew it up.


Alan Forced Congruency

It is easier to change the specification to fit the program than vice versa.


Amdahl’s Law (be mindful about tech)

The speedup gained from running a program on a parallel computer is greatly limited by the fraction of that program that can’t be parallelized.
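Stated as a formula: if a fraction p of a program’s runtime parallelizes across n processors, the overall speedup is 1 / ((1 − p) + p/n). A minimal sketch in Python (the function name is illustrative):

```python
def amdahl_speedup(parallel_fraction: float, processors: int) -> float:
    """Overall speedup when `parallel_fraction` of the work runs on `processors`."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / processors)

# Even with a million processors, a 5% serial portion caps the speedup near 20x.
print(round(amdahl_speedup(0.95, 4), 2))      # 3.48
print(round(amdahl_speedup(0.95, 10**6), 2))  # 20.0
```

Note how the serial fraction, not the processor count, dominates the limit.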


Archimedean Principle (Architecture)

A software system built on top of a weak architecture will sink due to the weight of its own success.


Atwood’s Law (Code)

Any software that can be written in JavaScript will eventually be written in JavaScript.


Boris Lemma (Testing)

Bugs lurk in corners and congregate at boundaries.


Bruce Transmutation

Any sufficiently advanced bug is indistinguishable from a feature.


Conway’s Law (Org & Team)

Any piece of software reflects the organizational structure that produces it. In other words, organizations that design systems (of all sorts) are constrained to produce designs that are copies of the communication structures of these organizations.


Davis’ Law

How valuable a demo of a system or application appears depends on the perspective from which it is viewed, which is not necessarily the perspective for which it was built.


Dijkstra Observation

If debugging is the process of removing software bugs, then programming must be the process of putting them in.


Doherty Threshold

Productivity soars when a computer and its users interact at a pace (<400ms) that ensures that neither has to wait on the other.


DRY 

Don’t repeat yourself. A principle of software development aimed at reducing repetition of software patterns, replacing it with abstractions, or using data normalization to avoid redundancy.
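As a concrete, hypothetical illustration in Python, a copied validation pattern is replaced by a single abstraction; all names here are invented for the example:

```python
# Before: the same "required field" pattern is copied for every field.
def validate_name(user: dict) -> None:
    if not user.get("name"):
        raise ValueError("name is required")

def validate_email(user: dict) -> None:
    if not user.get("email"):
        raise ValueError("email is required")

# After (DRY): the pattern is expressed once, parameterized by field name.
def require_field(user: dict, field: str) -> None:
    if not user.get(field):
        raise ValueError(f"{field} is required")

user = {"name": "Ada", "email": "ada@example.com"}
for field in ("name", "email"):
    require_field(user, field)  # one definition covers every field
```

A fix to the validation rule now happens in one place instead of many.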


Eagleson’s Law

Any code of your own that you haven’t looked at for six or more months might as well have been written by someone else.


First Law of Code Documentation

No comments.


Fion’s Axiom

There does not now, nor will there ever, exist a programming language in which it is the least bit hard to write bad programs.


Fundamental Limit of Requirements

Requirements end where the liberty of the developers begins.


Gilb’s Laws of Unreliability Programming

Computers are unreliable, but humans are even more unreliable.

Any system which depends on human reliability is unreliable.

The only difference between the fool and the criminal who attacks a system is that the fool attacks unpredictably and on a broader front.

A system tends to grow in terms of complexity rather than simplification until the resulting unreliability becomes intolerable.

Self-checking systems tend to have a complexity in proportion to the inherent unreliability of the system in which they are used.

The error-detection and correction capabilities of any system will serve as the key to understanding the type of errors that they cannot handle.

Undetectable errors are infinite in variety, in contrast to detectable errors, which by definition are limited.

All real programs contain errors until proven otherwise — which is impossible.

Investment in reliability will increase until it exceeds the probable cost of errors, or somebody insists on getting some useful work done.


Gray Dichotomy

Any structural problem (though not a performance problem) can be solved by introducing a level of indirection.


Greenspun’s Tenth Rule of Programming

Any sufficiently complicated C or Fortran program contains an ad-hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.


Heisenberg Requirement

The more stable a requirement is considered, the greater the probability it is changed.


Heisenbug Uncertainty Principle

Most production software bugs are soft: they go away when you look at them.

Hoare Duality

There are two ways of building a piece of software: one is to make it so simple that there are obviously no errors. The other is to make it so complicated that there are no obvious errors.

Hoare’s Law of Large Programs

Inside every large program is a small program struggling to get out.

Hofstadter’s Law

A task always takes longer than you expect, even when you take into account Hofstadter’s Law.

Hyrum’s Law

With a sufficient number of users of an API, it does not matter what you promise in the contract: all observable behaviors of your system will be depended on by somebody.

Kaner Non-Symmetry

A program that perfectly meets a lousy specification is a lousy program.

Kerckhoffs’s Principles

The system must be practically, if not mathematically, indecipherable.

It must not be required to be secret, and it must be able to fall into the hands of the enemy without inconvenience. 

Its key must be communicable and retainable without the help of written notes, and changeable or modifiable at the will of the correspondents. 

It must be applicable to telegraphic correspondence. 

Apparatus and documents must be portable, and their usage and function must not require the concourse of several people. 

Finally, it is necessary, given the circumstances that command its application, that the system be easy to use, requiring neither mental strain nor the knowledge of a long series of rules to observe.

Lehman’s Laws of Software Evolution

An E-type system must be continually adapted or it becomes progressively less satisfactory.

As an E-type system evolves, its complexity increases unless work is done to maintain or reduce it.

E-type system evolution processes are self-regulating with the distribution of product and process measures close to normal.

The average effective global activity rate in an evolving E-type system is invariant over the product’s lifetime.

As an E-type system evolves, all associated with it, developers, sales personnel, and users, for example, must maintain mastery of its content and behavior to achieve satisfactory evolution. Excessive growth diminishes that mastery. Hence the average incremental growth remains invariant as the system evolves. 

The functional content of an E-type system must be continually increased to maintain user satisfaction over its lifetime. 

The quality of an E-type system will appear to be declining unless it is rigorously maintained and adapted to operational environment changes.

E-type evolution processes constitute multi-level, multi-loop, multi-agent feedback systems and must be treated as such to achieve significant improvement over any reasonable base.

Linus’s Law

Given enough eyeballs, all bugs are shallow.

Michael Solution

If you automate a mess, you get an automated mess.

Nathan’s First Law

Software is like gas; it expands to fill its container.

Pesticide Paradox

If the same tests are repeated over and over again, eventually the same test cases will no longer find new bugs.

Wegner’s Lemma

It is impossible to fully specify or test an interactive system designed to respond to external inputs.

Ziv’s Law

Software development is unpredictable, and documented artifacts such as specifications and requirements will never be fully understood.

KISS

Keep it simple, stupid.

Liskov Substitution Principle

Functions that use pointers to base classes must be able to use objects of derived classes without knowing it.
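A minimal Python sketch of the principle (the class names are invented for the example): `total_area` is written against the base type and must keep working for any subtype it is handed.

```python
import math

class Shape:
    def area(self) -> float:
        raise NotImplementedError

class Rectangle(Shape):
    def __init__(self, width: float, height: float):
        self.width, self.height = width, height

    def area(self) -> float:
        return self.width * self.height

class Circle(Shape):
    def __init__(self, radius: float):
        self.radius = radius

    def area(self) -> float:
        return math.pi * self.radius ** 2

def total_area(shapes: list) -> float:
    # Relies only on the Shape contract; never inspects the concrete type.
    return sum(shape.area() for shape in shapes)

print(total_area([Rectangle(2, 3), Circle(1)]))  # 6 + pi
```

A subtype that weakened the contract (say, an `area` that raised for valid inputs) would break every caller written against `Shape`.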

YAGNI

You aren’t gonna need it. A programmer should not add functionality until deemed necessary.

Kelly’s Law

Software scope will always increase in proportion to resources.

Kernighan’s Law

Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.

Sixty-sixty Rule

Sixty percent of software’s dollar is spent on maintenance, and sixty percent of that maintenance is enhancement.

Spector’s Law

The time it takes your favorite application to complete a given task doubles with each new revision.

Yao’s Principle

The expected cost of a randomized algorithm on the worst-case input is no better than the expected cost for a worst-case probability distribution on the inputs of the deterministic algorithm that performs best against that distribution.
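In symbols, writing cost(A, x) for the cost of deterministic algorithm A on input x, with R a randomized algorithm (a distribution over deterministic algorithms) and μ a distribution over inputs:

```latex
\max_{x} \; \mathbb{E}\big[\operatorname{cost}(R, x)\big]
\;\ge\;
\min_{A} \; \mathbb{E}_{x \sim \mu}\big[\operatorname{cost}(A, x)\big]
```

So a lower bound proved for the best deterministic algorithm against some chosen input distribution is also a lower bound on every randomized algorithm’s worst case.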

Bergman Dilation (Management)

There’s never enough time to do it right, but there’s always enough time to do it over.

Brooks’s Law

Adding manpower to a late software project makes it later.

Weinberg-Brooks’ Law

More software projects have gone awry from management’s taking action based on incorrect system models than from all other causes combined.

Chekhov’s Gun

Remove everything that has no relevance to the story.

Ellison’s Law of Data

Once the business data have been centralized and integrated, the value of the database is greater than the sum of the preexisting parts.

Gustafson’s Law 

Any sufficiently large problem can be efficiently parallelized. 
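Its usual formal statement is the scaled speedup S(n) = n − s·(n − 1), where s is the serial fraction of the workload and the problem size grows with the processor count n. A sketch in Python (the function name is illustrative):

```python
def gustafson_speedup(serial_fraction: float, processors: int) -> float:
    """Scaled speedup when the problem grows to fill the available processors."""
    n = processors
    return n - serial_fraction * (n - 1)

# Unlike Amdahl's fixed-size bound, the speedup keeps growing with n.
print(round(gustafson_speedup(0.05, 64), 2))    # 60.85
print(round(gustafson_speedup(0.05, 1024), 2))  # 972.85
```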

Mencken Razor

For every complex problem, there is a solution that is simple, neat, and wrong.

Redundancy Conundrum (Testing, Specifications, Architecture)

Redundancy is a major source of errors, though it can also be used to reveal them.

Augustine’s 1st Law (Specifications, Architecture)

It is true that complex systems may be expensive, but it must be remembered that they don’t contribute much. In short: it costs a lot to build bad products.

Augustine’s 2nd Law (Management, Time)

Any task can be completed in only one-third more time than is currently estimated.

Simple systems are not feasible because they require infinite testing.

Hardware works best when it matters the least.

Augustine’s 16th Law (Fundamentals)

Software is like entropy. It is difficult to grasp, weighs nothing, and obeys the Second Law of Thermodynamics: i.e., it always increases.

Humphrey’s 1st Law (Management)

Conscious attention to a task normally performed automatically can impair its performance.

Humphrey’s 2nd Law (Specifications)

For a new software system, the requirements will not be completely known until after the users have used it.

Goodhart’s Law (Management)

When a measure becomes a target, it ceases to be a good measure.

Parkinson’s Law (Management)

Work expands so as to fill the time available for its completion.

Three F’s of Priority Management (Management)

Functionality, Fidelity, Efficiency.

Dijkstra’s Law (Specifications)

Simplicity is a great virtue but it requires hard work to achieve it and education to appreciate it. And to make matters worse: complexity sells better.

Dude’s Law (Vision)

If you don’t have a good reason for the project, it doesn’t matter how well you do it.

Gall’s Law (Architecture)

A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.

Glass’s Law (Specification)

Requirement deficiencies are the prime source of project failures.

Hartree’s Law (Management)

Whatever the state of a project, the time a project leader will estimate for completion is constant.

The Second Zeno Paradox (Specification)

What remains to be done is not enough to satisfy the customer. Customer satisfaction is a moving target.

Barnes Law (Management)

Cost, time, and quality, choose two.

Hanlon’s Razor (Team)

Do not attribute to malice that which is more easily explained by stupidity.

IBM Pollyanna Principle (Vision)

Machines should work. People should think.

Papert’s Principle (Learn)

Some of the most crucial steps in mental growth are based not simply on acquiring new skills, but on acquiring new administrative ways to use what one already knows.

Parkinson’s Law of Triviality (Team & Org)

Members of an organization give disproportionate weight to trivial issues.

Cornuelle’s Law (Team & Org)

Authority tends to assign jobs to those least able to do them.

Dilbert Principle (Team)

Incompetent employees are promoted to management positions to get them out of the workflow.

Peter Principle (Team)

People in a hierarchy tend to rise to their level of incompetence. 

Courtois’s Rule (Team)

If people listened to themselves more often, they’d talk less.

Frisch’s Law (Team)

You cannot have a baby in one month by getting nine women pregnant.

Joy’s Law (Team & Org)

No matter who you are, most of the smartest people work for someone else.

Sayre’s Law (Team)

In any dispute the intensity of feeling is inversely proportional to the value of the issues at stake.

Sattinger’s Law (Humor)

It works better if you plug it in.

Shirky Principle (Team & Org)

Institutions will try to preserve the problem to which they are the solution.

Sutton’s Law (Testing)

When diagnosing, one should first consider the obvious.

First Principles (Fundamentals)

Define the base principles to reason more clearly. First-principles thinking is one of the best ways to reverse-engineer complicated situations and unleash creative possibility. Sometimes called reasoning from first principles, it’s a tool to help clarify complicated problems by separating the underlying ideas or facts from any assumptions based on them. What remains are the essentials. If you know the first principles of something, you can build the rest of your knowledge around them to produce something new.

Law of Argumentative Comprehension (Team)

The more people understand something, the more willing they are to argue about it, and the more vigorously they will do so.

Law of the Instrument (Skills)

If all you have is a hammer, everything looks like a nail.

Lister’s Law (Team, Management)

People under time pressure don’t think faster.

Sturgeon’s Law (Fundamentals)

Ninety percent of everything is crap.

Murphy’s Law (Fundamentals)

If anything can go wrong, it will, and at the most inopportune time.

Sod’s Law (Fundamentals)

If something can go wrong, it will.

Schneier’s Law (Testing)

Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can’t break.

Rosenthal Effect / Pygmalion Effect (Team)

People tend to internalize the expectations others hold of them: when others expect more of a person’s competence and worth, that person’s performance tends to rise to meet the expectation.

Rothbard’s Law (Team)

People tend to specialise in what they’re worst at.

Algorithms (System Thinking, Fundamentals)

While hard to precisely define, an algorithm is generally an automated set of rules or a “blueprint” leading a series of steps or actions resulting in a desired outcome, and often stated in the form of a series of “If → Then” statements. Algorithms are best known for their use in modern computing, but are a feature of biological life as well. For example, human DNA contains an algorithm for building a human being.
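A toy Python sketch of such a rule set; the rules themselves are invented purely for illustration:

```python
def classify_temperature(celsius: float) -> str:
    # An algorithm as an ordered series of "If -> Then" rules.
    if celsius >= 38.0:
        return "fever"
    if celsius >= 37.0:
        return "elevated"
    return "normal"

print(classify_temperature(38.5))  # fever
print(classify_temperature(36.6))  # normal
```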

Bottleneck (System Thinking, Fundamentals)

A bottleneck is the point at which a flow (of something tangible or intangible) is stopped, constraining its continuous movement. As with a clogged artery or a blocked drain, a bottleneck in the production of any good or service can be small yet have a disproportionate impact if it sits on the critical path. However, bottlenecks can also be a source of inspiration, as they force us to reconsider whether there are alternate pathways to success.

Churn (System Thinking, Fundamentals)

Insurance companies and subscription services are well aware of the concept of churn — every year, a certain number of customers are lost and must be replaced. Standing still is the equivalent of losing, as seen in the model called the “Red Queen Effect.” Churn is present in many business and human systems: A constant figure is periodically lost and must be replaced before any new figures are added over the top.

Critical Mass (System Thinking, Fundamentals)

A system becomes critical when it is about to jump discretely from one phase to another. The marginal utility of the last unit before the phase change is wildly higher than any unit before it. A frequently cited example is water turning from a liquid to a vapor when heated to a specific temperature. “Critical mass” refers to the mass needed to have the critical event occur, most commonly in a nuclear system.

Emergence (System Thinking, Fundamentals)

Higher-level behavior tends to emerge from the interaction of lower-order components. The result is frequently not linear — not a matter of simple addition — but rather non-linear, or exponential. An important resulting property of emergent behavior is that it cannot be predicted from simply studying the component parts.

Equilibrium (System Thinking, Fundamentals)

Homeostasis is the process through which systems self-regulate to maintain an equilibrium state that enables them to function in a changing environment. Most of the time, they over or undershoot it by a little and must keep adjusting. Like a pilot flying a plane, the system is off course more often than on course. Everything within a homeostatic system contributes to keeping it within a range of equilibrium, so it is important to understand the limits of the range.

Feedback Loops (System Thinking, Fundamentals)

All complex systems are subject to positive and negative feedback loops whereby A causes B, which in turn influences A (and C), and so on — with higher-order effects frequently resulting from continual movement of the loop. In a homeostatic system, a change in A is often brought back into line by an opposite change in B to maintain the balance of the system, as with the temperature of the human body or the behavior of an organizational culture. Automatic feedback loops maintain a “static” environment unless and until an outside force changes the loop. A “runaway feedback loop” describes a situation in which the output of a reaction becomes its own catalyst (auto-catalysis).
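A negative feedback loop can be sketched in a few lines of Python; the thermostat-style constants here are invented for illustration:

```python
def regulate(temperature: float, setpoint: float = 37.0,
             gain: float = 0.5, steps: int = 20) -> float:
    """Negative feedback: each correction opposes the current deviation."""
    for _ in range(steps):
        error = setpoint - temperature   # A (temperature) influences B (error) ...
        temperature += gain * error      # ... and B pushes A back toward balance.
    return temperature

print(round(regulate(30.0), 3))  # 37.0 (converges on the setpoint)
```

Flipping the sign of `gain` turns the same loop into a runaway positive feedback loop, with the deviation amplifying itself on each step.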

Principle of Irreducibility (System Thinking, Fundamentals)

We find that in most systems there are irreducible quantitative properties, such as complexity, minimums, time, and length. Below the irreducible level, the desired result simply does not occur. One cannot get several women pregnant to reduce the amount of time needed to have one child, and one cannot reduce a successfully built automobile to a single part. These results are, to a defined point, irreducible.

Law of Diminishing Returns (System Thinking, Fundamentals)

Related to scale, most important real-world results are subject to an eventual decrease of incremental value. A good example would be a poor family: Give them enough money to thrive, and they are no longer poor. But after a certain point, additional money will not improve their lot; there is a clear diminishing return of additional dollars at some roughly quantifiable point. Often, the law of diminishing returns veers into negative territory — i.e., receiving too much money could destroy the poor family.

Margin of Safety (System Thinking, Fundamentals)

Engineers have long made a habit of adding a margin for error into all calculations. In an unknown world, driving a 9,500-pound bus over a bridge built to hold precisely 9,600 pounds is rarely seen as intelligent. Thus, on the whole, few modern bridges ever fail. In practical life outside of physical engineering, we can often profitably give ourselves margins as robust as the bridge system.

Principle of Scale (System Thinking, Fundamentals)

One of the most important principles of systems is that they are sensitive to scale. Properties (or behaviors) tend to change when you scale them up or down. In studying complex systems, we must always be roughly quantifying — in orders of magnitude, at least — the scale at which we are observing, analyzing, or predicting the system.

Human-Machine Polarisation Principle (be mindful about tech)

Artificial intelligence is always better than human stupidity. Computers can never replace human stupidity.

Amara’s Law (predicting future)

We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.

Andy and Bill’s Law (predicting future)

When a computer chip is released, new software will be released to use up all of its power.

Asimov’s 1st Law of Robotics (be mindful about tech)

A robot may not injure a human being or, through inaction, allow a human being to come to harm. 

Asimov’s 2nd Law of Robotics (be mindful about tech)

A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

Asimov’s 3rd Law of Robotics (be mindful about tech)

A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Bell’s Law (predicting future)

Every decade, a new computing platform forms, evolves, and builds a new, independent industry.

Maes–Garreau Law (predicting future)

Most favorable predictions about future technology fall around the latest possible date at which they can come true while still remaining within the lifetime of the person making the prediction.

Norvig’s Law (predicting future)

Any technology that surpasses 50% penetration will never double again (in any number of months).

Clarke’s 3rd Law (be mindful about tech)

Any sufficiently advanced technology is indistinguishable from magic.

Clarke’s 1st Law (predicting future)

When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.

Ellison’s Law of Cryptography and Usability

The user base for strong cryptography declines by half with every additional keystroke or mouse click required to make it work.

Moore’s Law (predicting future)

Processing power for computers will double every two years.
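Doubling every two years is compound growth: after t years, capacity multiplies by 2^(t/2). A sketch in Python (the function name is illustrative):

```python
def moore_projection(base_capacity: float, years: float,
                     doubling_period: float = 2.0) -> float:
    """Capacity after repeated doubling every `doubling_period` years."""
    return base_capacity * 2 ** (years / doubling_period)

print(moore_projection(1.0, 10))  # 32.0 -- five doublings in a decade
```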

Gilder’s Law (predicting future)

Bandwidth grows at least three times faster than computer power.

Grosch’s Law (predicting future)

Computer performance increases as the square of the cost.

Jevons Paradox (predicting future)

Technological progress increases the efficiency with which a resource is used (reducing the amount necessary for any one use), but the rate of consumption of that resource rises due to increasing demand.

Melvin Kranzberg’s 1st law (be mindful about tech)

Technology is neither good nor bad; nor is it neutral. 

Melvin Kranzberg’s 2nd law (be mindful about tech)

Invention is the mother of necessity.

Melvin Kranzberg’s 3rd law (be mindful about tech)

Technology comes in packages, big and small.

Melvin Kranzberg’s 4th law (be mindful about tech)

Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology-policy decisions.

Melvin Kranzberg’s 5th law (be mindful about tech)

All history is relevant, but the history of technology is the most relevant.

Melvin Kranzberg’s 6th law (be mindful about tech)

Technology is a very human activity — and so is the history of technology.

Zimmermann’s Law (be mindful about tech)

The natural flow of technology tends to move in the direction of making surveillance easier, and the ability of computers to track us doubles every eighteen months.

1–9–90 Rule (Architecture, Specification)

On a collaborative user-generated-content website, 90% of the users only view content, 9% of the participants edit content, and 1% of the participants actively create content.

Putt’s Law (Team & Org)

Technology is dominated by two types of people: those who understand what they do not manage and those who manage what they do not understand.

Spafford’s Adoption Rule (be mindful about tech)

For just about any technology, be it an operating system, application, or network, when a sufficient level of adoption is reached, that technology then becomes a threat vector.

Antoine de Saint-Exupéry (Specification, Architecture)

Perfection is achieved not when there is nothing more to add, but when there is nothing left to take away.

Norman Strange Attractor (Specification, Architecture)

The hardest part of design…is keeping features out.

Tesler’s Law (Specification, Architecture)

Every application has an inherent amount of complexity that cannot be removed or hidden. Instead, it must be dealt with, either in product development or in user interaction.

The Adams Pitfall (Specification, Architecture)

A common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools.

Yerkes–Dodson Law (Team)

Elevated arousal levels can improve performance, but only up to a certain point.

Pareto Principle (Fundamentals)

The Pareto principle states that, for many events, roughly 80% of the effects come from 20% of the causes.

Occam’s Razor (Fundamentals)

Among competing hypotheses that predict equally well, the one with the fewest assumptions should be selected.
