GNU Unifont 16.0.02
Pan-Unicode font with complete Unicode Plane 0 coverage and partial coverage of higher planes
unifont-support.c: Support functions for Unifont .hex files.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
Functions

void parse_hex (char *hexstring, int *width, unsigned *codept, unsigned char glyph[16][2])
    Decode a Unifont .hex file into Unicode code point and glyph.

void glyph2bits (int width, unsigned char glyph[16][2], unsigned char glyphbits[16][16])
    Convert a Unifont binary glyph into a binary glyph array of bits.

void hexpose (int width, unsigned char glyphbits[16][16], unsigned char transpose[2][16])
    Transpose a Unifont .hex format glyph into 2 column-major sub-arrays.

void glyph2string (int width, unsigned codept, unsigned char glyph[16][2], char *outstring)
    Convert a glyph code point and byte array into a Unifont .hex string.

void xglyph2string (int width, unsigned codept, unsigned char transpose[2][16], char *outstring)
    Convert a code point and transposed glyph into a Unifont .hex string.
unifont-support.c: Support functions for Unifont .hex files.
Definition in file unifont-support.c.
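These functions are intended to be used together: parse_hex decodes one .hex line, glyph2bits and hexpose reshape the glyph for display, and glyph2string or xglyph2string re-emit a .hex line. The sketch below chains them for a single line; it assumes linking against unifont-support.c, and the sample .hex line, buffer size, and main wrapper are illustrative, not taken from the library.

    /* Usage sketch (assumes linking against unifont-support.c).
       The sample .hex line and buffer size are illustrative. */
    #include <stdio.h>

    /* Prototypes as documented on this page. */
    void parse_hex     (char *hexstring, int *width, unsigned *codept,
                        unsigned char glyph[16][2]);
    void glyph2bits    (int width, unsigned char glyph[16][2],
                        unsigned char glyphbits[16][16]);
    void hexpose       (int width, unsigned char glyphbits[16][16],
                        unsigned char transpose[2][16]);
    void glyph2string  (int width, unsigned codept, unsigned char glyph[16][2],
                        char *outstring);
    void xglyph2string (int width, unsigned codept,
                        unsigned char transpose[2][16], char *outstring);

    int
    main (void)
    {
       /* One line from a Unifont .hex file (illustrative glyph data). */
       char hexline[] = "0041:0000000018242442427E424242420000";
       int width;
       unsigned codept;
       unsigned char glyph[16][2];
       unsigned char glyphbits[16][16];
       unsigned char transpose[2][16];
       char outstring[80];   /* assumed large enough for one .hex line */

       parse_hex (hexline, &width, &codept, glyph);    /* decode the line      */
       glyph2bits (width, glyph, glyphbits);           /* expand to 16x16 bits */
       hexpose (width, glyphbits, transpose);          /* column-major halves  */
       glyph2string (width, codept, glyph, outstring); /* back to .hex format  */
       printf ("%s\n", outstring);

       return 0;
    }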
void glyph2bits (int width, unsigned char glyph[16][2], unsigned char glyphbits[16][16])
Convert a Unifont binary glyph into a binary glyph array of bits.
This function takes a Unifont 16-row by 1- or 2-byte wide binary glyph and returns an array of 16 rows by 16 columns. For each output array element, a 1 indicates the corresponding bit was set in the binary glyph, and a 0 indicates the corresponding bit was not set.
[in]   width      The number of columns in the glyph.
[in]   glyph      The binary glyph, as a 16-row by 2-byte array.
[out]  glyphbits  The converted glyph, as a 16-row, 16-column array.
Definition at line 91 of file unifont-support.c.
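As a rough illustration of the conversion described above (a sketch, not the code at line 91), one way to expand each row's bytes into individual bits is shown below; it assumes the leftmost pixel is the most significant bit of each byte.

    /* Sketch only: expands a 16-row binary glyph into a 16x16 array of
       0/1 values, assuming the leftmost pixel is the most significant
       bit of each byte.  Not the library's actual code. */
    void
    sketch_glyph2bits (int width, unsigned char glyph[16][2],
                       unsigned char glyphbits[16][16])
    {
       int row, column;
       int nbytes = (width <= 8) ? 1 : 2;  /* 1 byte = 8 columns, 2 bytes = 16 */

       for (row = 0; row < 16; row++) {
          for (column = 0; column < 16; column++)
             glyphbits[row][column] = 0;   /* clear columns beyond the width */

          for (column = 0; column < 8 * nbytes; column++)
             glyphbits[row][column] =
                (glyph[row][column / 8] >> (7 - (column % 8))) & 1;
       }
    }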
void glyph2string (int width, unsigned codept, unsigned char glyph[16][2], char *outstring)
Convert a glyph code point and byte array into a Unifont .hex string.
This function takes a code point and a 16-row by 1- or 2-byte binary glyph, and converts it into a Unifont .hex format character array.
[in]   width      The number of columns in the glyph.
[in]   codept     The code point to appear in the output .hex string.
[in]   glyph      The glyph, with each of 16 rows 1 or 2 bytes wide.
[out]  outstring  The output string, in Unifont .hex format.
Definition at line 221 of file unifont-support.c.
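A minimal sketch of producing such a string appears below; it assumes a 4-hex-digit code point field (higher-plane code points would need more digits) and is not the code at line 221.

    /* Sketch only: builds "XXXX:" followed by 32 or 64 hex digits.
       The 4-digit code point field is an assumption of this sketch. */
    #include <stdio.h>

    void
    sketch_glyph2string (int width, unsigned codept,
                         unsigned char glyph[16][2], char *outstring)
    {
       int row;
       int nbytes = (width <= 8) ? 1 : 2;
       int offset = sprintf (outstring, "%04X:", codept);

       for (row = 0; row < 16; row++) {
          offset += sprintf (outstring + offset, "%02X", glyph[row][0]);
          if (nbytes == 2)
             offset += sprintf (outstring + offset, "%02X", glyph[row][1]);
       }
    }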
void hexpose (int width, unsigned char glyphbits[16][16], unsigned char transpose[2][16])
Transpose a Unifont .hex format glyph into 2 column-major sub-arrays.
This function takes a 16-by-16 cell bit array made from a Unifont glyph (as created by the glyph2bits function) and outputs a transposed array of 2 sets of 8 or 16 columns, depending on the glyph width. This format simplifies outputting these bit patterns on a graphics display with a controller chip designed to output a column of 8 pixels at a time.
For a line of text rendered with Unifont, the top 8 rows of pixels of every glyph on the line can be displayed first, followed by the bottom 8 rows of every glyph. This simplifies code for controller chips that automatically advance by one column for each successive byte of column data.
The glyphbits array contains a 1 in each cell where the corresponding non-transposed glyph has a pixel set, and a 0 in each cell where a pixel is not set.
[in]   width      The number of columns in the glyph.
[in]   glyphbits  The 16-by-16 pixel glyph bits.
[out]  transpose  The array of 2 sets of 8 or 16 columns of 8 pixels.
Definition at line 150 of file unifont-support.c.
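The sketch below illustrates the kind of packing described: each output byte holds one 8-pixel column, with the upper half of the glyph in transpose[0] and the lower half in transpose[1]. The bit ordering within each column byte is an assumption of this sketch, not taken from the code at line 150.

    /* Sketch only: packs each 8-pixel column into one byte.  The bit
       ordering (topmost pixel in the most significant bit) is an
       assumption; a given display controller may expect the reverse. */
    void
    sketch_hexpose (int width, unsigned char glyphbits[16][16],
                    unsigned char transpose[2][16])
    {
       int half, row, column;

       for (half = 0; half < 2; half++) {        /* rows 0-7, then rows 8-15 */
          for (column = 0; column < 16; column++)
             transpose[half][column] = 0;        /* clear unused columns     */

          for (column = 0; column < width; column++) {
             unsigned char colbyte = 0;

             for (row = 0; row < 8; row++)
                colbyte = (unsigned char)
                          ((colbyte << 1) | glyphbits[8 * half + row][column]);

             transpose[half][column] = colbyte;
          }
       }
    }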
void parse_hex (char *hexstring, int *width, unsigned *codept, unsigned char glyph[16][2])
Decode a Unifont .hex file into Unicode code point and glyph.
This function takes one line from a Unifont .hex file and decodes it into a code point followed by a 16-row glyph array. Each row of the glyph array can be one byte (8 columns) or two bytes (16 columns) wide.
[in]   hexstring  The Unicode .hex string for one code point.
[out]  width      The number of columns in a glyph with 16 rows.
[out]  codept     The code point, contained in the first .hex file field.
[out]  glyph      The Unifont glyph, as 16 rows by 1 or 2 bytes wide.
Definition at line 44 of file unifont-support.c.
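A hedged sketch of this decoding is given below; it assumes the standard .hex layout of a hexadecimal code point, a colon, then 32 or 64 hex digits, and omits all error handling (the actual code is at line 44).

    /* Sketch only: decodes "XXXX:<32 or 64 hex digits>" into a code
       point and a 16-row glyph.  Error handling is omitted. */
    #include <stdio.h>
    #include <string.h>

    void
    sketch_parse_hex (char *hexstring, int *width, unsigned *codept,
                      unsigned char glyph[16][2])
    {
       char *data = strchr (hexstring, ':') + 1;   /* glyph data after ':'  */
       int digits = (int) strspn (data, "0123456789ABCDEFabcdef");
       int nbytes = (digits <= 32) ? 1 : 2;        /* 32 digits = 8 columns */
       int row;

       sscanf (hexstring, "%X", codept);           /* code point field      */
       *width = 8 * nbytes;

       for (row = 0; row < 16; row++) {
          unsigned byte0, byte1 = 0;

          if (nbytes == 1)
             sscanf (data + 2 * row, "%2X", &byte0);
          else
             sscanf (data + 4 * row, "%2X%2X", &byte0, &byte1);

          glyph[row][0] = (unsigned char) byte0;
          glyph[row][1] = (unsigned char) byte1;
       }
    }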
void xglyph2string (int width, unsigned codept, unsigned char transpose[2][16], char *outstring)
Convert a code point and transposed glyph into a Unifont .hex string.
This function takes a code point and a transposed Unifont glyph, stored as 2 sets of 8-pixel column bytes, and converts it into a Unifont .hex format character array.
[in]   width      The number of columns in the glyph.
[in]   codept     The code point to appear in the output .hex string.
[in]   transpose  The transposed glyph, with 2 sets of 8-row data.
[out]  outstring  The output string, in Unifont .hex format.
Definition at line 267 of file unifont-support.c.
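A short usage sketch pairing hexpose with xglyph2string appears below; the wrapper function name and buffer size are assumptions, not part of the library.

    /* Usage sketch: print a glyph in its transposed, column-major form.
       The wrapper name and buffer size are assumptions. */
    #include <stdio.h>

    void hexpose (int width, unsigned char glyphbits[16][16],
                  unsigned char transpose[2][16]);
    void xglyph2string (int width, unsigned codept,
                        unsigned char transpose[2][16], char *outstring);

    void
    print_transposed (int width, unsigned codept,
                      unsigned char glyphbits[16][16])
    {
       unsigned char transpose[2][16];
       char outstring[80];          /* assumed large enough for one line */

       hexpose (width, glyphbits, transpose);
       xglyph2string (width, codept, transpose, outstring);
       printf ("%s\n", outstring);
    }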