Peter Liniker


Where in the world

Portland, that’s where! I went to PyCon, which was the best thing I could have done: I’ve been programming in Python for years and had gotten pretty tired of it, and PyCon helped me see all the reasons to really appreciate the design of Python and CPython and the tradeoffs they’ve made.

In particular, trading efficiency for really good C interop.

I knew this already, but in light of the Gilectomy and MicroPython I now have a greater respect for that tradeoff and a belief that it is worth it. It also helped me see what I’d like out of an interpreted language written in Rust.

To be fair, though, what had really wearied me with Python was working with some expansive codebases like OpenStack and SaltStack. Both systems have given me endless games of whack-a-mole, tracking down bugs that would probably not have occurred in a statically typed language. That’s not Python’s fault, though; I just have my doubts about duck typing everywhere, all the time, in large codebases.

My favorite part of PyCon though? I got to see a really good friend for the first time in years and hang out with other wonderfully fun people. Portland itself though… hmmm!

What safety hole? I don’t remember…

Anyway, in Episode VI I said:

If anybody cared to look at memory.rs they might have considered that Ptr<T> references memory in an Arena but the lifetime of Ptr<T> is not limited to the lifetime of the Arena it is connected to. I’ve let some possible use-after-free unsafety leak out.

I thought about this, and tried adding an explicit lifetime to Ptr<T>, and thought about it some more. These lifetimes are viral and start cluttering everything up. I don’t like it, yet it would be the right thing to do.

I’m not going to do it.

I did it. I couldn’t let this go. It was really hard (for me). I cried, got angry, broke up with Rust, leaned on Python’s shoulder, got back together to work through our differences, was interrupted by life a thousand times, and finally copied and modified somebody else’s solution.

Here is today’s diff. I’m going to try to explain what’s going on here…

In which I explain

After failing to wrap my head around the lifetime problem through my own brain power, I went looking for repositories with allocators to see what other people had implemented.

I reviewed a good number of hobby interpreters written in Rust on GitHub, and of those that did not use Rc<T> and had implemented some kind of allocator, not one had used explicit lifetimes to umbilically tether pointers to their mother allocator. Either I’m not the only person with a potential use-after-free, or they found ways I didn’t see to define a safe-Rust API that can’t accidentally leak dead object pointers. I’m going to assume the latter right this second.

Finally I searched for allocator crates and came to rphmeier’s allocators crate in which, joy, I found pointer and allocator types that explicitly prevent, at compile time, pointers from outliving their allocator.

I’ve taken these types and modified them to fit my use case.

My pointer type is now parameterized by an Allocator type and by a lifetime bound to that Allocator, and it ties the pointer to its Allocator using std::marker::PhantomData. The connection exists only at compile time, so the pointer type remains just a pointer with no additional runtime baggage.

use std::marker::PhantomData;

// A raw pointer into an Allocator's memory; the 'a in PhantomData<&'a A>
// ties the pointer to its allocator at compile time with no runtime cost.
pub struct Ptr<'a, T, A: 'a + Allocator> {
    ptr: *mut T,
    _marker: PhantomData<&'a A>
}

I also defined the Allocator trait, similar to how the allocators crate does it.

pub trait Allocator {
    // The elided lifetime on the returned Ptr is the lifetime of &self,
    // so a pointer cannot outlive the allocator that handed it out.
    fn alloc<T>(&self, object: T) -> Ptr<T, Self> where Self: Sized;
}

At first I made fn alloc<T>() take &mut self, because it seemed logical that an allocator is a mutable thing (without noticing that allocators uses &self).

Later I realized that the system allocator behaves more like a global data structure that is implicitly mutably aliased everywhere. With that in mind, I updated Allocator::alloc() to take &self and use interior mutability instead of taking &mut self. This solved many compile errors.
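
To make that concrete, here is roughly the shape an arena with interior mutability can take. This is only a sketch for the blog, not the real memory.rs: the Arena name, the with_capacity constructor and the fixed-block bump-pointer layout are all stand-ins, and it assumes it sits in the same module as the Ptr and Allocator definitions above so it can build a Ptr directly.

use std::alloc::{alloc, dealloc, Layout};
use std::cell::Cell;
use std::marker::PhantomData;
use std::mem::{align_of, size_of};

// Sketch only: a fixed-size bump arena. `next` lives in a Cell so that
// alloc() can advance it through &self (interior mutability).
pub struct Arena {
    block: *mut u8,
    size: usize,
    next: Cell<usize>,
}

impl Arena {
    pub fn with_capacity(size: usize) -> Arena {
        assert!(size > 0);
        let layout = Layout::from_size_align(size, 16).unwrap();
        let block = unsafe { alloc(layout) };
        assert!(!block.is_null(), "arena allocation failed");
        Arena { block, size, next: Cell::new(0) }
    }
}

impl Allocator for Arena {
    fn alloc<T>(&self, object: T) -> Ptr<T, Self> {
        // Round the bump offset up to T's alignment (assumed <= 16 here),
        // claim size_of::<T>() bytes, then move the object into them.
        let align = align_of::<T>();
        let start = (self.next.get() + align - 1) & !(align - 1);
        let end = start + size_of::<T>();
        assert!(end <= self.size, "arena out of space");
        self.next.set(end);
        unsafe {
            let slot = self.block.add(start) as *mut T;
            std::ptr::write(slot, object);
            Ptr { ptr: slot, _marker: PhantomData }
        }
    }
}

impl Drop for Arena {
    fn drop(&mut self) {
        // Like most arenas, this never runs destructors of allocated objects;
        // it just hands the whole block back.
        let layout = Layout::from_size_align(self.size, 16).unwrap();
        unsafe { dealloc(self.block, layout) }
    }
}

And this is the payoff of threading 'a through Ptr: the borrow checker now refuses to let a pointer escape its arena, so something like this no longer compiles:

let escaped = {
    let arena = Arena::with_capacity(1024);
    arena.alloc(42u64)  // error: `arena` does not live long enough
};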

(The SymbolMap type now uses interior mutability too, for similar reasons.)
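
Concretely (and again, the real SymbolMap may look different, and the usize ids here are just a stand-in for whatever it actually interns), interior mutability there means the map can be grown through &self, something along these lines:

use std::cell::RefCell;
use std::collections::HashMap;

// Sketch: interning symbols through &self by keeping the map in a RefCell.
pub struct SymbolMap {
    map: RefCell<HashMap<String, usize>>,
}

impl SymbolMap {
    pub fn new() -> SymbolMap {
        SymbolMap { map: RefCell::new(HashMap::new()) }
    }

    // Returns the existing id for a name, or assigns the next one;
    // note the &self receiver even though an insertion may happen.
    pub fn lookup(&self, name: &str) -> usize {
        let mut map = self.map.borrow_mut();
        let next_id = map.len();
        *map.entry(name.to_string()).or_insert(next_id)
    }
}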

Still, just adding explicit lifetimes and refactoring allocation into a trait left me with so many horrible compiler errors that I played non-duck-typing, static-checking, I-don’t-quite-understand whack-a-mole for quite some time.

Now that I look at the final diff, though, the changes are logical and consistent.

The <'a, A: 'a + Allocator> parameters and bounds are, quite simply, pervasive.
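
For example, any type that wants to hold a Ptr has to carry the same parameters itself, and so does every function that allocates. A hypothetical Pair (not from the actual diff) shows the pattern:

pub struct Pair<'a, T, A: 'a + Allocator> {
    pub first: Ptr<'a, T, A>,
    pub second: Ptr<'a, T, A>,
}

// Allocating one: the 'a on &'a A flows into the returned Ptr, so the
// Pair can never be reached through a pointer that outlives its arena.
pub fn cons<'a, T, A: 'a + Allocator>(
    mem: &'a A,
    first: Ptr<'a, T, A>,
    second: Ptr<'a, T, A>,
) -> Ptr<'a, Pair<'a, T, A>, A> {
    mem.alloc(Pair { first, second })
}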

Up next…

Holy moly it took a long time to get here from the previous post. Worst blog series ever.

I’d like to actually begin traversing the AST and doing something with it next!