Jun 21 2014

## Building a CPU Part I – Implementing logic gates

For a long time now I've been thinking to myself: "I know how to develop software, and I also have a bit of practice with drivers and compiling operating systems (I even tried writing my own boot-sector code and running it), but I never really understood how a CPU works."

So I've decided to take matters into my own hands and try to build a CPU. At first I thought about creating it all from scratch, but quickly enough I discovered that the number of transistors I would have to solder is enormous (I will soon show that even a simple NAND gate requires 4 transistors). So I've decided to build it on an FPGA, which will make life easier, and instead of creating my own architecture, I've decided to try to mimic the 6502 microprocessor, which is the CPU used by the NES. If everything goes according to plan, I will be able to play Super Mario on my own CPU.

This first post will have nothing to do with FPGAs, since logic gates come out of the box when using one, so it will also have nothing to do with my implementation of the 6502 microprocessor. But in my view this part is the most important one, because logic gates are the basic building blocks of any electronic device, and in particular of a CPU.
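Even before touching hardware, the idea that everything can be built from a single gate can be sketched in software. The following Python model is my own illustration (it is not part of the FPGA design): it takes NAND as the only primitive and derives NOT, AND, OR and XOR purely from NAND calls.

```python
# A software model of logic gates, with NAND as the only primitive.
# Every other gate below is built exclusively out of NAND calls.

def nand(a: int, b: int) -> int:
    """The primitive gate: output is 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    # NOT(a) = NAND(a, a)
    return nand(a, a)

def and_(a: int, b: int) -> int:
    # AND is NAND followed by NOT
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    # De Morgan: a OR b = NAND(NOT a, NOT b)
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    # Classic 4-NAND construction of XOR
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# Print the full truth table of the derived gates.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_(a, b), or_(a, b), xor(a, b))
```

Note that XOR costs four NAND gates in this construction, which hints at why transistor counts grow so quickly in a discrete build.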

Feb 5 2014

## Introduction to Lie Algebras

In my last post I described the structure called an algebra. Although this structure has other uses, my goal from the start was to use it to define a new structure, called a Lie algebra.

The Lie algebra is a very useful structure, and it is used repeatedly in quantum mechanics and analytical mechanics.

Definition: A Lie algebra is an algebra whose operation (usually denoted by $[\cdot\,,\cdot]$ and called the bracket) satisfies the following two axioms:

1. Anti-commutativity: $[x, x] = 0$ for all $x \in L$.
2. Jacobi identity: $[x, [y, z]] + [y, [z, x]] + [z, [x, y]] = 0$ for all $x, y, z \in L$.

One can easily show that a subalgebra or a quotient algebra of a Lie algebra is a Lie algebra.

Using the definition we can prove a quick lemma:

Lemma: Let $L$ be a Lie algebra (over a field $F$), and let $x, y \in L$. Then: $[x, y] = -[y, x]$.

Proof: From the bilinearity of the operation we have:

$$0 = [x + y, x + y] = [x, x] + [x, y] + [y, x] + [y, y].$$

From the first axiom of Lie algebras we know that $[x, x] = [y, y] = 0$. Thus:

$$[x, y] = -[y, x],$$

as required. ■

(If the characteristic of the field is not 2, which we will always assume, the converse also holds: setting $y = x$ in the lemma's identity gives $2[x, x] = 0$, hence $[x, x] = 0$. So in characteristic $\neq 2$ the two conditions are equivalent.)

We can also define the center of a Lie algebra (the same definition works for a general algebra, but that's not our case here) as follows:

Definition: The center of a Lie algebra $L$ is the set:

$$Z(L) = \{ z \in L \mid [z, x] = 0 \text{ for all } x \in L \}.$$

Note that in the case of a Lie algebra, because of the lemma we've proven, we can conclude that $[z, x] = 0$ if and only if $[x, z] = 0$, so there is no need to distinguish between a "left" and a "right" center.

### Examples

1. Take any vector space $V$, and define the operation (usually called the bracket) to be $[x, y] = 0$ for all $x, y \in V$. This is a commutative (also called abelian) Lie algebra, and of course the center of $V$ is $V$ itself.
2. $\mathbb{R}^3$ with $[x, y] = x \times y$ (the vector product). In this case the center of $\mathbb{R}^3$ is zero, since $x \times y = 0$ for every $y$ only when $x = 0$.
3. Let $A$ be an associative algebra with the product $ab$ (for example, the algebra $M_n(F)$ of $n \times n$ matrices). Then the space $A$ with the bracket $[a, b] = ab - ba$ is a Lie algebra, denoted by $A^{(-)}$; in the matrix case it is denoted by $\mathfrak{gl}_n(F)$.
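As a quick sanity check of example 2 (a Python sketch added for illustration, not part of the original argument), we can verify numerically that the vector product on $\mathbb{R}^3$ satisfies both Lie algebra axioms:

```python
# Check that the cross product on R^3 behaves like a Lie bracket:
# [x, x] = 0 and the Jacobi identity.

def cross(x, y):
    """The vector product on R^3, used here as the bracket [x, y]."""
    return (x[1] * y[2] - x[2] * y[1],
            x[2] * y[0] - x[0] * y[2],
            x[0] * y[1] - x[1] * y[0])

def add(*vs):
    """Componentwise sum of several vectors."""
    return tuple(sum(c) for c in zip(*vs))

x, y, z = (1.0, 2.0, 3.0), (-4.0, 0.5, 2.0), (0.0, 1.0, -1.0)

# Axiom 1: [x, x] = 0
print(cross(x, x))                      # (0.0, 0.0, 0.0)

# Axiom 2: [x,[y,z]] + [y,[z,x]] + [z,[x,y]] = 0
jacobi = add(cross(x, cross(y, z)),
             cross(y, cross(z, x)),
             cross(z, cross(x, y)))
print(jacobi)                           # (0.0, 0.0, 0.0)
```

Of course a numerical check on a few vectors proves nothing; the identities hold algebraically, and the code merely illustrates them.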

Moreover, for the physicist readers: in analytical mechanics you've met the Poisson bracket, which satisfies the Jacobi identity, and in quantum mechanics physicists are always talking about the commutator of two operators (for example, the Hamiltonian and the momentum). Theorems from Lie algebra theory can easily be applied to these subjects, and it's another way of showing how we can learn facts about the universe simply by playing with math.

Feb 5 2014

## Algebras

It's been a long time since my last post about mathematics, which is kind of a shame, because I wanted math to be one of the main subjects of this blog.
So I've decided to start writing a bit more about the things I loved in mathematics (and hopefully I will follow through).

In this post, I’m assuming some basic Linear Algebra knowledge. This post is a little boring, but it defines an important algebraic structure that is used in some very beautiful subjects and theorems, so stay tuned.

### Algebras and Subalgebras

Definition: An algebra $A$ is a vector space over a field $F$, endowed with a binary bilinear operation $A \times A \to A$, $(x, y) \mapsto xy$, s.t. for all $x, y, z \in A$ and $\alpha, \beta \in F$:

$$(\alpha x + \beta y)z = \alpha(xz) + \beta(yz), \qquad x(\alpha y + \beta z) = \alpha(xy) + \beta(xz).$$

For example, the polynomials in $n$ variables, $F[x_1, \ldots, x_n]$, with the usual multiplication as the operation, form an associative and commutative algebra. This algebra is usually called the polynomial algebra.

But please note, NOT all algebras are commutative! For example, we can look at the vector space of the $n \times n$ matrices, $M_n(F)$, with matrix multiplication as the operation. For $n \geq 2$, $M_n(F)$ is an associative, non-commutative algebra.
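A minimal numerical illustration of this non-commutativity (a Python sketch with two arbitrarily chosen $2 \times 2$ matrices, added here for illustration):

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as tuples of row tuples."""
    return tuple(tuple(sum(A[i][k] * B[k][j] for k in range(2))
                       for j in range(2))
                 for i in range(2))

A = ((1, 1),
     (0, 1))
B = ((1, 0),
     (1, 1))

# AB and BA differ, so matrix multiplication is not commutative.
print(matmul(A, B))   # ((2, 1), (1, 1))
print(matmul(B, A))   # ((1, 1), (1, 2))
```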

We can also define a subalgebra:

Definition: A subspace $B \subseteq A$ is called a subalgebra if $BB \subseteq B$, i.e. $xy \in B$ for all $x, y \in B$.

For example, the diagonal matrices form a subalgebra of the matrix algebra $M_n(F)$.

#### Homomorphism

And just like in any other algebraic structure, we can define a homomorphism between two algebras as a linear map between them that preserves the operation, that is:

Definition: Let $A, B$ be algebras over $F$; a linear map $\varphi : A \to B$ is called a homomorphism if:

$$\varphi(xy) = \varphi(x)\varphi(y) \quad \text{for all } x, y \in A.$$
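A standard concrete instance (sketched here in Python for illustration; this example is not taken from the post) is the evaluation map $p \mapsto p(t)$ from the polynomial algebra $F[x]$ to $F$: it is linear, and it preserves the product because $(pq)(t) = p(t)\,q(t)$.

```python
def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists (index = degree)."""
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def evaluate(p, t):
    """The evaluation map p -> p(t), a linear map into the base field."""
    return sum(a * t**i for i, a in enumerate(p))

p = [1, 2]      # 1 + 2x
q = [3, 0, 1]   # 3 + x^2
t = 5

# Homomorphism property: (pq)(t) = p(t) * q(t)
print(evaluate(poly_mul(p, q), t))       # 308
print(evaluate(p, t) * evaluate(q, t))   # 308
```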

### Ideals

Definition: A subspace $I$ of an algebra $A$ is a left (resp. right, resp. two-sided) ideal if:

$$AI \subseteq I \quad (\text{resp. } IA \subseteq I, \text{ resp. both } AI \subseteq I \text{ and } IA \subseteq I).$$

It is clear from the definition that any ideal is a subalgebra (since $II \subseteq AI \subseteq I$).

Using a two-sided ideal $I$, we can look at the quotient space $A/I$ (a space where every vector in $I$ becomes the zero vector, meaning that two vectors that differ only by a vector in $I$ are identified). The quotient space has a canonical algebra structure given by:

$$(x + I)(y + I) = xy + I,$$

which is called the quotient algebra $A/I$.
It's easy to see that the canonical map $\pi : A \to A/I$ given by $\pi(x) = x + I$ is an algebra homomorphism.
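To see a quotient algebra concretely, here is a small Python sketch (a representation chosen for illustration, not from the post) of $F[x]/(x^2)$: every coset has a unique representative of the form $a + bx$, and the product rule above becomes polynomial multiplication followed by discarding all terms of degree $\geq 2$, which lie in the ideal.

```python
# F[x] / (x^2): the coset a + bx + (x^2) is stored as the pair (a, b).
# Multiplication is polynomial multiplication, then dropping every term
# of degree >= 2 (those lie in the ideal, i.e. they are "zero").

def quot_mul(u, v):
    a, b = u
    c, d = v
    # (a + bx)(c + dx) = ac + (ad + bc)x + bd*x^2, and x^2 = 0 here
    return (a * c, a * d + b * c)

def pi(coeffs):
    """The canonical map pi: F[x] -> F[x]/(x^2): truncate to degree < 2."""
    padded = list(coeffs) + [0, 0]
    return (padded[0], padded[1])

p = [2, 3, 7]   # 2 + 3x + 7x^2
q = [1, 4]      # 1 + 4x

# pi is an algebra homomorphism: only the degree < 2 part of p*q survives.
print(quot_mul(pi(p), pi(q)))   # (2, 11), i.e. 2 + 11x
```

In particular $x \cdot x = 0$ in this quotient even though $x \neq 0$, so quotient algebras can have zero divisors that the original algebra lacks.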

Moreover, if $\varphi : A \to B$ is an algebra homomorphism, then the kernel $\ker \varphi$ is a two-sided ideal of $A$ and the image $\varphi(A)$ is a subalgebra of $B$.