Interface Friction as Type Safety
Fellow Python programmers, I have traveled far and wide. If you will be Kublai Khan to my Marco Polo, I can tell of strange and distant lands. I have seen camelCase variables pass through eyeOfNeedle. I've seen AbstractFactories in clouds that make concrete. And I've seen strange creatures who speak with strange rules called "types". In Rust's land of crustaceans, all who speak of these types carelessly are stricken down.
fn double_an_integer(x: i64) -> i64 {
    return x * 2;
}

fn main() {
    double_an_integer(4.0);
}
error[E0308]: mismatched types
 --> src/main.rs:6:25
  |
6 |     double_an_integer(4.0);
  |     ----------------- ^^^ expected `i64`, found floating-point number
  |     |
  |     arguments to this function are incorrect
Of course, Python also has types, but it barely cares for them. You can ask for type information with type() or isinstance(), but no one will mind if you don't. Python is built around "duck typing". If it walks like a duck and quacks like a duck, it's a duck, and the question is what you can do, not what you are. Metaphysics are lame.
def double_something(x):
    return x * 2

double_something(1)        # 2
double_something(1.25)     # 2.5
double_something("Hello")  # HelloHello
double_something(None)     # TypeError: unsupported operand type(s) for *: 'NoneType' and 'int'
This is our language's ultimate double-edged sword. On one hand, we never know what we're working with. Our house is built on sand, and any function call could screw everything up. On the other hand, it gives us options we never thought to ask for:
def is_there_overlap(iterable_a, iterable_b):
    for element in iterable_a:
        if element in iterable_b:
            return True
    return False

is_there_overlap([1,2,3], [4,5,6])            # False
is_there_overlap([1,2,3], (2,3,4))            # True
is_there_overlap({'a', 'b'}, 'I like apples') # True
[1,2,3] is a mutable list. (2,3,4) is an immutable tuple. {'a', 'b'} is an unordered set with unique elements and constant-time lookup. 'I like apples' is a Unicode string. These inputs differ in myriad ways, but none are important for our use case. We can mix and match these data types and everything works.
If we knew for a fact we were working with sets, we could simplify this to:
def is_overlap_for_sets_only(set_a, set_b):
    return not set_a.isdisjoint(set_b)
But if we don't know or care what we're working with, we get flexibility for free. Python will only complain if we're asking for something impossible, like if we need mutability but provide an immutable tuple.[1] Sometimes, this is great. But sometimes, we want to lock down exactly what something can work with, and sometimes we want to go beyond the types Python naturally provides. That's when we feel the sharp edge of the sword.
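To see where that complaint actually happens, here's a throwaway sketch (add_marker is my name, not anything standard):

def add_marker(elements):
    # Works on anything with .append(); ducks only.
    elements.append("marker")
    return elements

add_marker([1, 2, 3])  # [1, 2, 3, 'marker']
add_marker((1, 2, 3))  # AttributeError: 'tuple' object has no attribute 'append'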
These days, we can add type annotations, which Python will happily and dutifully ignore.
def double_an_integer(x: int) -> int:
    return x * 2

definitely_an_integer: int = 5
definitely_an_integer = "Whoops!"

result = double_an_integer(definitely_an_integer)
print(result)  # "Whoops!Whoops!"
Python's types are basically comments. This is still useful; it's a way to precisely state our intent, and third-party tools can check if we're consistent. But this still puts the onus on us to be diligent.
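To be precise about "basically comments": the annotations are stored at runtime in a dict the interpreter never consults; third-party checkers like mypy read that metadata and complain on our behalf. A small sketch:

def double_an_integer(x: int) -> int:
    return x * 2

# Python stores the annotations but never acts on them:
print(double_an_integer.__annotations__)
# {'x': <class 'int'>, 'return': <class 'int'>}

# No runtime complaint here; a checker like mypy would flag this line.
print(double_an_integer("!"))  # "!!"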
A Motivating Example
At work, I write code to interface with Google Ads, mostly to update two settings. The first is Budget: how much we can spend on a given campaign. The second is ROAS, or Return On Ad Spend. 100% ROAS means we want to break even; 125% ROAS means that if we spend $10, we want to make $12.50 in revenue, or $2.50 in profit.[2]
Alas, Google has set a trap, and that trap is unit conversion. Google's API thinks of money in "micros", or millionths of a dollar, and expects to see $50 as 50000000.[3] Google also provides a bulk editor, but it thinks in dollars, so it expects 50. The API thinks a 110% ROAS is 1.1. In the bulk editor, it's 110. If you mess up and tell the bulk editor your ROAS is 1.1, that means you're fine losing 98.9% of the money you spend. And if you accidentally set your budget to $50,000,000, you've told Google it's fine to lose $49,450,000 on this campaign. Your boss may disagree.
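The arithmetic itself is trivial, which is what makes the trap so nasty; the hard part is remembering which surface wants which unit. A sketch, with hypothetical helper names:

MICROS_PER_DOLLAR = 1_000_000

def budget_for_api(dollars: float) -> int:
    # The API wants $50 as 50000000 micros.
    return int(dollars * MICROS_PER_DOLLAR)

def roas_for_api(percent: float) -> float:
    # The API wants 110% as 1.1; the bulk editor wants plain 110.
    return percent / 100

budget_for_api(50)  # 50000000
roas_for_api(110)   # 1.1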
How can we be confident we're in the right units? If we mess up, the company's bankrupt. We don't want every function to have to check its range. We could write some validation function and make sure it's idempotent, like only changing the value if it's out of range. But then again, we have to remember to call it. We want to take in some input, convert it to an internal format, use that internal format everywhere we can, and then convert once and only once at the end.
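In sketch form (all names hypothetical), that strategy looks like this: one conversion on the way in, one on the way out, and only the internal format in between.

def parse_budget(raw: str) -> float:
    # Once, at the input boundary: external input -> internal dollars.
    return float(raw)

def apply_seasonal_bump(budget_dollars: float) -> float:
    # Business logic only ever touches the internal format.
    return budget_dollars * 1.2

def to_api_micros(budget_dollars: float) -> int:
    # Once, at the output boundary: internal dollars -> API micros.
    return int(budget_dollars * 1_000_000)

budget = parse_budget("50")
budget = apply_seasonal_bump(budget)
to_api_micros(budget)  # 60000000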
Dataclasses
One strategy is to create a custom data type which encodes these constraints. We could create a custom class with an __init__() function to enforce this, or use a dataclass and get things like comparison checks by default.
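The hand-rolled __init__() version might look something like this:

class ConstrainedBudget:
    def __init__(self, value: int):
        # Validate once, at construction time.
        if not 1 <= value <= 1000:
            raise ValueError("Budget out of range")
        self.value = value

The dataclass route gives us the same check, plus niceties like equality comparisons and a readable repr: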
from dataclasses import dataclass

@dataclass
class ConstrainedBudget:
    value: int

    def __post_init__(self):
        if not 1 <= self.value <= 1000:
            raise ValueError("Budget out of range")

budget = ConstrainedBudget(5)     # Succeeds
budget = ConstrainedBudget(5000)  # ValueError!
However, this can still get us in trouble:
budget = ConstrainedBudget(5)  # Succeeds
budget.value = 100000000000
print(budget.value)  # 100000000000, whoops!
So we need to make sure these values can only be set once:
from dataclasses import dataclass

@dataclass(frozen=True)
class ConstrainedBudget:
    value: int

    def __post_init__(self):
        if not 1 <= self.value <= 1000:
            raise ValueError("Budget out of range")

budget = ConstrainedBudget(5)
print(budget.value)
budget.value = 10000000000
# dataclasses.FrozenInstanceError: cannot assign to field 'value'
This is a good start, but there's another risk. What if we have multiple of these constrained values floating around?
@dataclass(frozen=True)
class ConstrainedBudget:
    value: float

    def __post_init__(self):
        if not 1 <= self.value <= 1000:
            raise ValueError("Budget out of range")

@dataclass(frozen=True)
class ConstrainedROAS:
    value: float

    def __post_init__(self):
        if not 0.1 <= self.value <= 2.0:
            raise ValueError("ROAS out of range")

def set_troas(constrained_roas: ConstrainedROAS):
    print(constrained_roas.value)

pretty_sure_this_is_a_roas = ConstrainedBudget(110)
set_troas(pretty_sure_this_is_a_roas)
# This "works", which is probably not what we want!
Whoops! Now it's once again extremely easy to mess this up. The trick here is that the interface IS the type. Those type annotations are a lie. These might look like different values, but if we can use them in the same way, they might as well be the same thing. That's the whole promise of duck typing, and our ducks have escaped.
Pydantic
What if we use a third-party tool like Pydantic, which does care about types?
from pydantic import BaseModel, Field

class LargeNumber(BaseModel):
    value: float = Field(ge=100)

class SmallNumber(BaseModel):
    value: float = Field(ge=0, le=1)

def process_small_number(small_num: SmallNumber):
    print("This number must be pretty small, right?")
    print(small_num.value)
    assert small_num.value < 100
Like the dataclass, this will check the range of values on creation:
small = SmallNumber(value=0.9)  # Succeeds
large = LargeNumber(value=1.1)  # Fails
pydantic_core._pydantic_core.ValidationError: 1 validation error for LargeNumber
value
  Input should be greater than or equal to 100 [type=greater_than_equal, input_value=1.1, input_type=float]
But like before, we can get around this:[4]
small = SmallNumber(value=0.9)
small.value = 100000000000
print(small)
And Pydantic still doesn't enforce that we're using this right.
large = LargeNumber(value=100000000000000000)
process_small_number(large)
# This number must be pretty small, right?
# 1e+17
# AssertionError!
We need a decorator to validate that we're using these types the right way:
from pydantic import validate_call

@validate_call
def process_small_number(small_num: SmallNumber):
    print("This number must be pretty small, right?")
    print(small_num.value)

not_a_small_number = LargeNumber(value=100000000000000000)
process_small_number(not_a_small_number)
pydantic_core._pydantic_core.ValidationError: 1 validation error for process_small_number
0
  Input should be a valid dictionary or instance of SmallNumber [type=model_type, input_value=LargeNumber(value=1e+17), input_type=LargeNumber]
But even this lets things slip through the cracks. It doesn't check the defaults:
from pydantic import BaseModel, Field, validate_call

class LargeNumber(BaseModel):
    # We've learned our lesson. These are immutable now.
    value: int = Field(ge=100, frozen=True)

class SmallNumber(BaseModel):
    value: float = Field(ge=0, le=1, frozen=True)

@validate_call
def process_small_number(small_num: SmallNumber = LargeNumber(value=1000000000)):
    print("This number must be pretty small, right?")
    print(small_num.value)

process_small_number(LargeNumber(value=100))  # Fails
process_small_number()                        # Succeeds!
And if we use inheritance and override parent methods, we can back ourselves into a corner, where the type no longer tells us what we want. What if something has a bill like a duck, but doesn't quack like a duck? What if it's some weird duck-like thing, like a platypus?
from pydantic import BaseModel, validate_call

class Mammal(BaseModel):
    # 99.9% of mammals give live birth, so let's define this here.
    def give_birth(self):
        class_name = self.__class__.__name__
        print(f"Giving birth to another {class_name}")
        return self.__class__()

    def lactate(self):
        ...

class Gazelle(Mammal):
    ...

class Lemur(Mammal):
    ...

class Monotreme(Mammal):
    # But monotremes are weird, so let's add some special behavior.
    def lay_eggs(self):
        print("No live babies, just eggs")

    def give_birth(self):
        # Override the broader Mammal method everyone else uses
        self.lay_eggs()

class Platypus(Monotreme):
    ...

@validate_call
def get_offspring(animal: Mammal):
    # No way to call this with e.g. a Reptile, so we're safe, right?
    return animal.give_birth()

cool_lemur = Lemur()
baby_lemur = get_offspring(cool_lemur)  # "Giving birth to another Lemur"
assert isinstance(baby_lemur, Lemur)    # Yep, it's a lemur

cool_platypus = Platypus()
baby_platypus = get_offspring(cool_platypus)  # "No live babies, just eggs"
assert isinstance(baby_platypus, Platypus)    # AssertionError!
How could we express that this function requires a Mammal, but *not* a monotreme? There are ways to do this, but they're awkward, and realistically we're back to explicitly checking every time, hoping we never forget.
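One of those awkward options, sketched with a hypothetical function name:

def get_offspring_safely(animal: Mammal):
    # A manual guard we must remember at every single call site.
    if isinstance(animal, Monotreme):
        raise TypeError("This function needs a live-bearing mammal")
    return animal.give_birth()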
But then again, part of our problem is that we kept the give_birth() interface, papering over an important difference. It would be better to break the interface.[5]
class Monotreme(Mammal):
    def lay_eggs(self):
        print("No live babies, just eggs")

    def give_birth(self):
        raise TypeError("Monotremes are weird.")
Likewise, if we go back to the Google example, we can break our interface so we can't use a Budget where we expect a ROAS, whether type checked or not:
from dataclasses import dataclass

@dataclass(frozen=True)
class ConstrainedBudget:
    budget_value: float  # Renamed "budget_value", not "value"

    def __post_init__(self):
        if not 1 <= self.budget_value <= 1000:
            raise ValueError("Budget out of range")

@dataclass(frozen=True)
class ConstrainedROAS:
    roas_value: float  # Renamed "roas_value"

    def __post_init__(self):
        if not 0.1 <= self.roas_value <= 2.0:
            raise ValueError("ROAS out of range")

def set_troas(constrained_roas: ConstrainedROAS):
    print(constrained_roas.roas_value)

pretty_sure_this_is_a_roas = ConstrainedBudget(110)
set_troas(pretty_sure_this_is_a_roas)
# AttributeError: 'ConstrainedBudget' object has no attribute 'roas_value'
Now, even if something slips through the type checks, it will fail fast, hopefully before we lose $49,450,000.
This is a bit ugly. If we keep our names explicit, we end up speaking redundantly. But the advantage of being redundant is that it's much harder to get this wrong. We have to re-declare our intent at every step. This is sometimes called Syntactic Salt, playing on the idea of "Syntactic Sugar" like the @validate_call decorator. We could just as easily say some_function = validate_call(some_function), but the @ syntax is cleaner and easier to type.
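Spelled out both ways (double_it is a throwaway name of mine):

from pydantic import validate_call

@validate_call  # Sugared: applied at definition time
def double_it(x: int) -> int:
    return x * 2

def double_it_by_hand(x: int) -> int:
    return x * 2

double_it_by_hand = validate_call(double_it_by_hand)  # Desugared: same thing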
The point of that shortcut is to encourage us to use this pattern, since it feels good to use. In contrast, redundantly specifying that we want a ROAS value every time we ask a ROAS class for one adds friction, and makes things a little bit of a pain. But friction is not wholly negative; your car tires don't minimize friction.[6]
Is this foolproof? Python is a loose language, and there are a million crazy ways to bypass checks. You can rummage through __class__.__dict__, run this in an InteractiveInterpreter() with mutable locals, or do something else insane, but I'm not losing sleep over that.
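For flavor, here's one of those million ways: a frozen dataclass only guards normal attribute assignment, and object.__setattr__ walks right past the guard.

from dataclasses import dataclass

@dataclass(frozen=True)
class ConstrainedBudget:
    value: int

budget = ConstrainedBudget(5)
# budget.value = 100  # FrozenInstanceError, as designed...
object.__setattr__(budget, "value", 100000000000)  # ...but nothing stops this
print(budget.value)  # 100000000000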
I lose sleep over doing this carelessly. I can see myself thoughtlessly updating class variables in a loop, or forgetting a default argument during a refactor, or leaving off that decorator once or twice, or making a copy/paste error and accidentally putting a SmallValue in a large_values list. More importantly, I don't always know what to look out for. Was it obvious this decorator didn't check defaults? What other steps will it ignore?
Conclusion
The punchline is that in Python, the interface is the type. If we want Python to have something like type safety, we have to break the interface. Whenever you care what something is:
- Make it immutable.
- Make its interface just a bit awkward to use.
Now, let us retire to Xanadu.
1. Unfortunately, lists and sets have one major incompatibility. We add() things to a set, but append() them to a list. I'm convinced this is a design flaw. It needlessly breaks code that could easily work with either type, seemingly to prove a point. Sets aren't ordered, so you can't append something to a set, because it won't necessarily go at the end. In fact, if the set already contains the element, this wouldn't append anything at all. Yep, cool. In the real world, when we import this, the Zen of Python says "practicality beats purity", and this is extremely impractical. It means we can't duck type two incredibly similar things. Look, Python, you can just make sets ordered. You did it with dictionaries. I don't care that some abstract mathematical definition says they're not. The abstract mathematical definition of arithmetic says 0.1 + 0.2 == 0.3, and Python, like every other language, says it's not. Python, like every other language, says 0.0 and -0.0 are distinct things. We crossed this Rubicon ages ago. We're in the reign of, like, Elagabalus. Let me write code that works please. [return]
2. Why not set ROAS to 1,000,000%? Wouldn't we rather make more money than less? This is essentially a roundabout way to communicate risk tolerance. A 200% ROAS doesn't mean we will double our money; it's more like telling Google "Only show my ad if you think it will double my money". Set your ROAS too high and Google will never show your ad, since they'll never be sure if it's worth it. Set your ROAS too low and Google will advertise pork rinds to Muslims and vegans. There's always a chance they'll convert! [return]
3. This is common when working with money. As noted above, 0.1 + 0.2 is not equal to 0.3, but 0.30000000000000004. In most cases, these inconsistencies don't matter, but people have a way of caring about edge cases when they might affect how they buy food. If you've seen Office Space, you know that these fractions of pennies add up. Computers are better at working with integers, so it's easier to fix the number of decimal places we care about and use cleaner math.
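   Roughly:

   print(0.1 + 0.2)    # 0.30000000000000004; floats drift
   cents = 10 + 20     # fix two decimal places and stay in integers
   print(cents / 100)  # 0.3, converting only for display at the very end

   [return]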
4. Pydantic lets these accept integers as inputs, even though we said to use floats. On the other hand, LargeNumber(value=4.5) would be invalid if LargeNumber needed an int. Technically, the int-to-float direction is a lossy conversion, since some large integers can't be represented as floats. LargeNumber(value=9007199254740993) will give you back LargeNumber(value=9007199254740992.0).[7] I could see a world where Pydantic didn't let us do this, but I think this is another case where practicality beats purity. Realistically, this will not matter for anything we'll write. If you need precision around large numbers, don't use floats! [return]
5. Bob Martin would say the problem is we've violated the Liskov Substitution Principle, but that's a hard principle to follow in the best of times. Harder still when you're neck-deep in platypus eggs. [return]
6. Of course, salt usually makes things taste better. Syntactic bitterness would be a better metaphor. Syntactic grapefruit juice? [return]
7. In the strangest foreign land, JavaScript, the language doesn't even have integers. Everything is floating point, and this is an infinite loop:

   for (let i = 9007199254740990; i < 9007199254740995; i++) {
       console.log(i)
   }

   [return]