Since I’ve yet again dug myself into the attracting basin of nocturnality and sleep seems to be eluding me tonight more than ever, I guess it only makes sense to stop rolling around hopelessly and do something at least marginally productive. So, here we go. (Side note: according to this spellchecker, “nocturnality” is not a word.)

Tonight, I want to talk about free vector spaces, but first, reflect a bit on the way we formalize the notion of a “sequence of stuff” in mathematics. Note that I’m *not* using the word “sequence” in the sense of its usual, analytical meaning (that is, some countably infinite list) but instead as something much more general. Given an “indexing” set $I$ of arbitrary size (cardinality), and a bunch of objects lying within another set $X$, how do we formally talk about some kind of “sequence” of these objects, where the “length” of it is the cardinality of $I$?

When you require this sequence to be a *list*, or *enumeration*, this is probably the most strict condition you could impose. The word “enumeration” suggests that not only should the indexing set be (totally?) ordered, it should also be *countable*, since we want to have the notion of a “successor” to any given element. For example, if a sequence is indexed by $\mathbb{R}$, it certainly makes sense to ask which of two indices comes “first” (due to the ordering), but it doesn’t make sense to ask which index comes “after” a given index. On the other hand, if the indexing set is $\mathbb{N}$ or $\mathbb{Z}$ (or even $\mathbb{Q}$?) it does make sense, although I guess there’s no standardized way of defining what the successor of a rational number is. Anyhow, we can also consider sequences which are indexed by uncountably infinite ordered sets — or we can forget about the ordering altogether and just have “a bunch of jumbled-up stuff, possibly with repetitions”. I guess in this case, your sequence basically breaks down to nothing more than a multiset, unless I’m mistaken.

One thing I noticed is that in combinatorics, we use *generating series* to encode a bunch of coefficients — which I think are usually indexed by $\mathbb{N}$. So this is one way of talking about a sequence; just use a formal sum:

$$\sum_{n \in \mathbb{N}} a_n x^n.$$
It seems to me like such formal sums are really just an alternate way of representing a function $\mathbb{N} \to R$ (where $R$ is wherever the coefficients live), although I guess sometimes people want to talk about term-by-term differentiation and so on, and the “formal sum” setting just makes that kind of thing more intuitive. The reason I say this is because when a sum is “formal” we usually have little concern with actually adding the terms; it could be a formal product for all we care, and we could write a $\prod$ instead. If we actually were *using* any kind of addition operation, then assuming the indexing set is infinite, we would have to worry about convergence of the sum, but convergence only makes sense when $R$ is endowed with some kind of metric (or at least topological) structure.
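Since I can’t resist making this concrete: here’s a tiny Python sketch of the “a formal sum is just a function on its index set” view, with a series stored as a dict from $\mathbb{N}$ to its nonzero coefficients, plus a term-by-term derivative (the function name is my own ad hoc choice, nothing standard):

```python
# A formal sum sum_n a_n x^n represented as a function N -> coefficients,
# here a dict mapping each index n to its (nonzero) coefficient a_n.
def differentiate(series):
    """Term-by-term formal derivative: a_n x^n -> n * a_n x^(n-1).

    No convergence questions arise: we never actually evaluate the sum,
    we just manipulate the underlying function of the indices.
    """
    return {n - 1: n * a for n, a in series.items() if n >= 1}

# The "sequence" 1 + 2x + 3x^2, encoded as a function on its index set:
f = {0: 1, 1: 2, 2: 3}
print(differentiate(f))  # {0: 2, 1: 6}, i.e. 2 + 6x
```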

Now that I’ve babbled on and on about encoding sequences (or I suppose “streams” would be a better name, depending on how thick of an indexing set you’re working with), I’ll finally say a bit about the usage of the term “free”. I was just thinking about this earlier: loosely speaking, it seems like we can pretty much bastardize every vector space out there to a space of formal sums. That is, given a vector space $V$ over a field $F$, just choose a basis $B$ for $V$ (Axiom of Choice believers only, of course). We know that everything in $V$ can be represented as a finite sum of stuff from $B$. In other words we can use this basis to construct an isomorphism from $V$ into a “space of formal sums” which each look like

$$\sum_{b \in B} a_b\, b, \qquad a_b \in F,$$

where all but finitely many of the $a_b$ are zero.

On the other hand, given a set $X$ we can construct something known as the *free vector space* over $X$, which seems to usually be denoted $F(X)$. To do this, we formalize our “formal sums” (oh, the irony) as functions $f : X \to F$ where $f(x) \neq 0$ for only finitely many $x \in X$, and we let our vector space consist precisely of all such $f$. Define addition and scalar multiplication in the usual (pointwise) way. Indeed, we can easily construct a basis for this space, merely by defining for each $x \in X$ the function $e_x : X \to F$ given by $e_x(x) = 1$, and $e_x(y) = 0$ for all $y \neq x$. Now just put $B = \{e_x : x \in X\}$. Certainly then, any function $f \in F(X)$, due to its being nonzero on only finitely many elements in $X$, can be represented as a *linear combination* of the elements in $B$. Moreover, linear independence is almost immediate from the definition.
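For the concretely-minded, here’s a little Python sketch of this construction, modeling finitely supported functions as dicts that simply omit the zero values (the names `e`, `add`, and `scale` are my own ad hoc choices, not standard notation):

```python
from collections import defaultdict

# Sketch of the free vector space F(X) over a set X: vectors are
# finitely supported functions X -> F, stored as dicts that omit zeros.

def e(x):
    """Basis vector e_x: the function sending x to 1 and all else to 0."""
    return {x: 1}

def add(f, g):
    """Pointwise addition of two finitely supported functions."""
    h = defaultdict(int)
    for v in (f, g):
        for x, a in v.items():
            h[x] += a
    return {x: a for x, a in h.items() if a != 0}  # drop zeros from support

def scale(c, f):
    """Scalar multiplication: (c * f)(x) = c * f(x)."""
    return {x: c * a for x, a in f.items() if c * a != 0}

# X can be any set at all -- here, strings. Build 3*e_apple + 2*e_pear:
v = add(scale(3, e("apple")), scale(2, e("pear")))
print(v)  # {'apple': 3, 'pear': 2}
```

The point of dropping the zero entries is exactly the finite-support condition: every vector touches only finitely many elements of $X$, no matter how big $X$ is.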

The term *free* also appears in module theory. Modules are structures like vector spaces, but we drop the requirement that the scalars come from a field, instead allowing them to come from any ring. Rings, of course, don’t necessarily furnish multiplicative inverses to your heart’s desire, nor are they necessarily commutative, so you have to worry about “left” and “right” modules, and so on. Anyways, the punch line is that modules (unlike vector spaces) don’t always have a basis; the ones that do are called *free* modules. (For example, $\mathbb{Z}/2\mathbb{Z}$ as a $\mathbb{Z}$-module has no basis, since every element is annihilated by the scalar $2$, so no nonempty subset can be linearly independent.) Lastly, the term appears in category theory as well, but I don’t know nearly enough to comment on that — as if commenting on module theory wasn’t far enough out of my league… 🙄