Understanding the GSM Um Layer 1
More or less involuntarily, I ended up spending the better part of the last couple of days understanding all the nasty details of Layer 1 of the GSM Um (MS-BTS) interface, i.e. the RF interface between your mobile phone and the BTS.
You can find a lot of secondary information about GSM both online and offline, but it all seems to address other issues: the layer 2 and layer 3 protocols, the actual logical channel multiplex (at least to some extent), network planning, etc. I have not found anything reasonably detailed about the actual channel coding, TDMA and synchronization, i.e. what happens after demodulation and before layer 2. So I had to read the specs. Oh well...
I had actually thought that all those parts were already there (in gssm and gsm-tvoid), but in reality they were just something like a proof-of-concept implementation: restricted to work only on Timeslot 0, not demuxing the individual parts of the CCCH (PCH/AGCH/BCCH/...) and not properly dealing with frame-number counting in case of various receiver overflows or the like.
So I've now started on a new codebase specifically focussed on everything that happens between the differentially decoded GSM bursts and layer 2.
Normally, one would not assume a Layer 1 to be particularly complex. However, in the case of GSM, it really is. There are various different physical channel configurations, each with its own nasty little differences in the channel coding schemes... plus the synchronous nature of the system, which requires you to know a lot of context and correctly count frames in order to correctly interpret and decode the incoming bits.
So where do we start? Most readers of this blog will know that GSM has 8 timeslots on each RF channel. Each timeslot forms one so-called physical channel, which can assume one of many physical channel configurations.
A physical channel configuration determines not only the particular logical channels residing in the physical channel, but also low-level aspects such as parity, convolutional code, training sequence, tail bits, etc.
In every timeslot on a physical channel we transmit something called a burst. Depending on the physical channel configuration, there are different burst types:
- Normal: A regular burst transmitting 142 bits of data
- Frequency Correction: A burst that helps us find the offset between sender and receiver clock rate, compensate for temperature drift, etc.
- Synchronization: A burst for achieving TDMA frame number synchronization
- Access: A special (shorter) burst used in the uplink (MS->BTS) direction to request a channel allocation from the BTS
- Dummy: A burst that is transmitted in otherwise unused/empty timeslots, but only on the ARFCN (radio frequency channel) that carries the CCCH/BCCH
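The 142 bits of the normal burst are what sits between the tail bits: per GSM 05.02 a full burst is 148 bits (3 tail + 57 data + 1 stealing flag + 26 training sequence + 1 flag + 57 data + 3 tail), followed by an 8.25-bit guard period. A minimal sketch of that layout (the field names are mine):

```python
# Sketch of the GSM normal-burst layout (GSM 05.02). The 142 "data" bits
# are everything between the two 3-bit tail fields: 57 + 1 + 26 + 1 + 57.

def split_normal_burst(bits):
    """Split a 148-element bit list into the normal-burst fields."""
    assert len(bits) == 148
    return {
        "tail_head": bits[0:3],
        "data_left": bits[3:60],      # 57 payload bits
        "flag_left": bits[60],        # stealing flag
        "training": bits[61:87],      # 26-bit training sequence
        "flag_right": bits[87],       # stealing flag
        "data_right": bits[88:145],   # 57 payload bits
        "tail_tail": bits[145:148],
    }

fields = split_normal_burst([0] * 148)
assert len(fields["data_left"]) + len(fields["data_right"]) == 114
```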
Eight successive timeslots form what is called a TDMA frame. Every time the timeslot number wraps from 7 to 0, the receiver increments the TDMA frame number.
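The frame number itself is not unbounded: it wraps after one hyperframe of 26 * 51 * 2048 = 2715648 TDMA frames (GSM 05.02), so the receiver's counter has to work modulo that value. A minimal sketch (the function name is mine):

```python
# Sketch of receiver-side TDMA frame-number bookkeeping, assuming the
# hyperframe length from GSM 05.02.

GSM_HYPERFRAME = 26 * 51 * 2048  # 2715648 TDMA frames

def next_frame_number(fn, old_ts, new_ts):
    """Increment the TDMA frame number whenever the timeslot wraps 7 -> 0."""
    if old_ts == 7 and new_ts == 0:
        return (fn + 1) % GSM_HYPERFRAME
    return fn
```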
For all control channels, four such bursts need to be concatenated and run through convolutional decoding and the parity check to form a 23-byte MAC block. On most, but not all, control channels those four bursts are sent in successive TDMA frames on one timeslot. The notable exception is the SACCH (Slow Associated Control Channel), which accompanies every TCH/F (Traffic Channel) and gets one burst per 26-frame traffic multiframe, i.e. at the 12th, 38th, 64th, 90th, ... TDMA frame. Thus, on such TCH/F channels you need to wait for 104 TDMA frames to obtain the four bursts needed to generate one MAC block. A MAC block is what is then passed up to layer 2.
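Per GSM 05.02, the 26-frame traffic multiframe gives frame 12 to the SACCH and leaves frame 25 idle (on half the timeslots these two positions are swapped). A sketch of that classification, assuming the frame-12 variant:

```python
# Sketch of the 26-frame traffic multiframe occupied by a full-rate TCH
# (GSM 05.02): frames 0-11 and 13-24 carry TCH bursts, frame 12 carries
# the SACCH, frame 25 is idle. Four SACCH bursts (one MAC block)
# therefore span 4 * 26 = 104 TDMA frames.

def tchf_frame_role(fn):
    """Classify one TDMA frame of a TCH/F timeslot by fn modulo 26."""
    pos = fn % 26
    if pos == 12:
        return "SACCH"
    if pos == 25:
        return "IDLE"
    return "TCH"

sacch = [fn for fn in range(104) if tchf_frame_role(fn) == "SACCH"]
assert sacch == [12, 38, 64, 90]  # four bursts -> one 23-byte MAC block
```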
Timeslot 0 of the primary radio channel of the BTS is called the CCCH (Common Control Channel). It is the only channel about whose physical configuration we can at least make some assumptions. However, the CCCH carries a number of different logical channels, such as the BCCH (Broadcast Control Channel), PCH (Paging Channel), RACH (Random Access Channel), SCH (Synchronization Channel) and FCCH (Frequency Correction Channel), possibly with an NCH and maybe even an SDCCH/8 multiplexed into it. The "problem" here is that the actual configuration of the logical channels is determined by a layer 2 System Information Type 3 message. So you have to work with the minimal guarantees: there will be an FCCH burst, followed by an SCH burst, followed by four BCCH bursts forming one MAC block, in which you will at some point find the SI Type 3 message telling you how all the other bursts of the CCCH are mapped onto the various other logical channels.
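Those minimal guarantees come from the fixed part of the downlink TS0 51-multiframe in GSM 05.02: an FCCH burst on every 10th frame, an SCH burst directly after each FCCH, one BCCH block at frames 2-5, an idle frame at 50, and the remaining four-frame blocks available for CCCH (PCH/AGCH/...) until SI Type 3 tells you otherwise. A sketch:

```python
# Sketch of the downlink TS0 51-multiframe structure guaranteed by the
# spec (GSM 05.02), before the SI Type 3 configuration is known.

def ts0_frame_role(fn):
    """Classify a downlink TS0 frame by its position in the 51-multiframe."""
    pos = fn % 51
    if pos == 50:
        return "IDLE"
    if pos % 10 == 0:
        return "FCCH"   # frames 0, 10, 20, 30, 40
    if pos % 10 == 1:
        return "SCH"    # frames 1, 11, 21, 31, 41
    if 2 <= pos <= 5:
        return "BCCH"   # one 4-burst MAC block
    return "CCCH"       # PCH/AGCH/... as configured by SI Type 3
```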
From the System Information messages you can also learn on which timeslot the CBCH (Cell Broadcast Channel) is located. Typically, it is part of an SDCCH/8+SACCH/8C physical configuration. On all the BTSs I've seen protocol traces of (which are not that many yet), this is on Timeslot 1.
An SDCCH/8+SACCH/8C configuration is another weird but more logical multiplex. It consists of 32 consecutive bursts, four for each of the 8 SDCCH instances it carries. The following 16 bursts are divided into four groups of four, which alternately serve SACCH0..3 or SACCH4..7. So you basically end up with two 51-multiframes, i.e. 102 bursts, out of which 8 bursts go to each of the SDCCH/8 instances and four go to each SACCH/8C instance (102 - 96 = 6, i.e. 3 bursts are dummy in each 51-multiframe).
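Under that mapping, the (sub)channel for any burst follows from its position in the 102-frame cycle: frames 0-31 of each 51-multiframe carry SDCCH0..7, frames 32-47 carry four SACCH subchannels (0-3 in the first multiframe, 4-7 in the second), and frames 48-50 are idle. A sketch, assuming that GSM 05.02 layout (the tuple return format is mine):

```python
# Sketch of the SDCCH/8 + SACCH/8C mapping over two 51-multiframes
# (102 frames), following the GSM 05.02 channel organisation.

def sdcch8_frame_role(fn):
    """Map a frame of an SDCCH/8 timeslot to its logical (sub)channel."""
    pos = fn % 102
    mf, off = divmod(pos, 51)          # which 51-multiframe, offset in it
    if off < 32:
        return ("SDCCH", off // 4)     # SDCCH0..7, four bursts each
    if off < 48:
        return ("SACCH", (off - 32) // 4 + 4 * mf)  # 0..3, then 4..7
    return ("IDLE", None)              # 3 idle frames per multiframe

# Over 102 frames: 8 bursts per SDCCH instance, 4 per SACCH instance.
roles = [sdcch8_frame_role(f) for f in range(102)]
assert roles.count(("SDCCH", 0)) == 8
assert roles.count(("SACCH", 7)) == 4
```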
For the TCH/F (full-rate traffic channel), the entire Viterbi decode and parity scheme differs from that of the control channels... but that is probably going to be the subject of another blog post.
Oh, in case you're interested: you can find the unfinished demultiplexing code snippets at this git tree.
Luckily, with a few more days of work, we can actually properly demultiplex and decode all incoming frames, learn about the [dynamic] channel assignments and configurations that the BTS makes, and reconstruct at least the MAC blocks for all control channels, if not even the actual speech codec data of the TCHs (on those unencrypted BTSs that exist in India). However, actually following a conversation is not possible (and not of interest to me anyway) due to frequency hopping.