# Thread: Boolean Algebra is Wrong

1. ## Boolean Algebra is Wrong

OK wrong is the wrong word, I just don't like how it's described. It's killing me since we're going over boolean algebra and logic gates in class, so I'm going to explain what I think is a better basis to you fine folk.

Classical Boolean Algebra

Classical Boolean algebra is defined by 4 things: a set of elements, 2 binary operations (operations that take 2 elements of the set and return 1 element), and 1 unary operation (an operation that takes 1 element of the set and returns 1 element). The set is {0,1}: everything we ever work with in boolean algebra will either be 0 or 1. The two binary operations are OR and AND. The unary operation is NOT. They are defined as follows:

OR: True IFF at least one input is True; 1 iff at least one input is 1.

AND: True IFF all inputs are True; 1 iff both inputs are 1.

NOT: True IFF the input is false; 1 iff the input is 0.

| A | B | A OR B | A AND B |
|---|---|--------|---------|
| 0 | 0 | 0      | 0       |
| 0 | 1 | 1      | 0       |
| 1 | 0 | 1      | 0       |
| 1 | 1 | 1      | 1       |

I talked a fair bit about identities in another math-y thread, and the concept carries over. OR has an identity of 0: for every possible value of A, A OR 0 = A. AND has an identity of 1: for every possible value of A, A AND 1 = A.
There's an obvious analogy to be made here: OR is somewhat additive. It has the same identity as addition. AND is multiplicative: it has the same identity as multiplication. For this reason, A OR B is often represented as A + B, and A AND B is often represented as A x B, or AB.

Think of the multiplication property where 0 swallows everything: no matter what, any chain of multiplication that contains at least one 0 will evaluate to 0. The same is true of AND. No matter how many AND operations you chain together, as long as one input somewhere along the chain is false, the whole thing evaluates to false, or 0. AND is very multiplicative. It's actually identical to normal multiplication on the set {0,1}.
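
Since the whole set is just {0,1}, a claim like this can be checked exhaustively. A quick Python sketch (the helper name is just for illustration):

```python
def AND(a, b):
    """Classical boolean AND on the set {0, 1}."""
    return a & b

# Exhaustive check: AND agrees with ordinary multiplication on {0, 1}.
assert all(AND(a, b) == a * b for a in (0, 1) for b in (0, 1))
```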

OR isn't quite so additive. It has the same identity, sure, and 0 OR 1 = 1 much like 0 + 1 = 1, but 1 OR 1 = 1. In that post I linked above, I talk about inverses, and OR isn't fully invertible on the set {0,1}. There's no element A such that 1 OR A = 0. You can't "undo" an A OR 1 operation to recover A. 1 is to OR what 0 is to AND: if you have at least one 1 in your chain of ORs, the whole thing evaluates to 1, no matter the rest (N.B. there's a term for elements that swallow all other elements under a certain operation, but I can't remember it). It's not quite as additive as AND is multiplicative.
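
That non-invertibility is also easy to verify by brute force; a small Python sketch (names are mine, just for illustration):

```python
def OR(a, b):
    """Classical boolean OR on the set {0, 1}."""
    return a | b

# Brute force: no element x satisfies 1 OR x = 0, so OR can't be undone.
inverses_of_one = [x for x in (0, 1) if OR(1, x) == 0]
assert inverses_of_one == []
```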

That brings us to NOT (NOT is usually represented as an overhead bar, I'm using a ' to represent it instead):

| A | A' |
|---|----|
| 0 | 1  |
| 1 | 0  |

NOT doesn't really have any analogy in normal algebra. It's not a binary operation so it has no identity.

Now each of these operations has a bunch of properties that come along with it. AND and OR are commutative; they're associative; they're distributive over each other (which is unusual for OR: additive operations usually aren't distributive the way multiplicative operations are), etc. From these properties you can derive a bunch of theorems about Boolean algebra. For example (using + and x to represent OR and AND):

A + A' = 1 (at least one of A or A' must be 1)
A x A' = 0 (at least one of A or A' must be 0)
(A')' = A
(A + B)' = A' x B'
(A x B)' = A' + B'

Those last two are called DeMorgan's Laws, or DeMorgan's theorem, and basically they establish that you can define OR in terms of AND and NOT, and you can define AND in terms of OR and NOT. If you take NOT of both sides:
A + B = (A' x B')'
A x B = (A' + B')'
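
DeMorgan's Laws can be checked over all four input pairs; a minimal Python sketch:

```python
def NOT(a):
    return 1 - a

# DeMorgan's Laws, checked on all four input pairs.
for a in (0, 1):
    for b in (0, 1):
        assert NOT(a | b) == NOT(a) & NOT(b)  # (A + B)' = A' x B'
        assert NOT(a & b) == NOT(a) | NOT(b)  # (A x B)' = A' + B'
```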

Now this is slightly odd: OR and AND are supposed to be the basis of our algebra; they're supposed to be fundamental units of it. If I can describe one in terms of the other, they're not really that fundamental. We can define our algebra with just one, along with NOT. It kinda makes one wonder which one is REALLY fundamental, which one we should base everything on if we were to reduce our basis to as few axioms as possible.

(HINT: OR should be tossed as a fundamental operation)

MOD 2 Arithmetic

There's another form of algebra that works on the set {0,1}: it's called MOD 2 arithmetic. Under MOD 2 arithmetic, every number gets replaced with its remainder after dividing by 2. Essentially, every even number becomes 0 and every odd number becomes 1. Addition and multiplication work the same way as usual, except you take MOD 2 of the results. So 1 x 1 = 1, 1 + 0 = 1, 0 + 0 = 0, all as expected. The only odd case is 1 + 1. Under normal arithmetic, that'd be 2, but if we take MOD 2 after adding, the result is 0. 1 + 1 = 0 under MOD 2 arithmetic. The tables describing addition and multiplication are as follows:

| A | B | A + B | A x B |
|---|---|-------|-------|
| 0 | 0 | 0     | 0     |
| 0 | 1 | 1     | 0     |
| 1 | 0 | 1     | 0     |
| 1 | 1 | 0     | 1     |

Multiplication behaves exactly as expected, so there's nothing new to consider here. The real question is about addition: does it have an identity, and is it fully invertible like standard addition? Here again, MOD 2 addition has an identity of 0: for all A, A + 0 = A. But MOD 2 addition is also fully invertible: the inverse of 1 is 1. For all A, A + A = 0 (1 + 1 = 0, and 0 + 0 = 0, the latter of which is trivial since identities are always their own inverses). Despite the slight oddity of 1 + 1 = 0, MOD 2 addition is much closer to regular addition than the classical boolean OR operation.
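
Again, with only two elements, both claims can be checked exhaustively; a quick Python sketch (the helper name is just for illustration):

```python
def add_mod2(a, b):
    """Addition in MOD 2 arithmetic."""
    return (a + b) % 2

# 0 is the identity, and every element is its own additive inverse.
assert all(add_mod2(a, 0) == a for a in (0, 1))
assert all(add_mod2(a, a) == 0 for a in (0, 1))
```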

Bringing it home

MOD 2 addition is identical to boolean XOR:

XOR: True IFF exactly one input is True; 1 iff exactly one input is 1.

If you build the truth table for XOR, it's exactly the same as the table for MOD 2 addition. If we make an analogy between MOD 2 arithmetic and the sort of algebraic structure we want for boolean algebra, it implies two basic operations: XOR and AND, represented by + and x. But boolean algebra is defined around AND, OR, and NOT: how do we get the latter 2 from XOR and AND?
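
That equivalence is a one-liner to check in Python (whose ^ operator is bitwise XOR):

```python
def xor(a, b):
    return a ^ b  # Python's bitwise XOR

# XOR and MOD 2 addition agree on every input pair.
assert all(xor(a, b) == (a + b) % 2 for a in (0, 1) for b in (0, 1))
```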

NOT, under XOR, is XOR 1, or + 1:

A' = A + 1 (0 + 1 = 1; 1 + 1 = 0).

Any time we want to take NOT of something, just add 1.
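
As a Python sketch:

```python
def NOT(a):
    """NOT in MOD 2 arithmetic: just add 1."""
    return (a + 1) % 2

assert NOT(0) == 1
assert NOT(1) == 0
```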

OR is slightly more complicated. Now that we have a NOT operation, we can use DeMorgan's Laws to derive OR:

A OR B = (A' x B')'
A OR B = (A + 1)(B + 1) + 1
A OR B = AB + A + B + 1 + 1
A OR B = AB + A + B
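
The derivation above can be sanity-checked by brute force; a quick Python sketch (the helper name is mine):

```python
def OR_mod2(a, b):
    """OR written purely in MOD 2 arithmetic: AB + A + B."""
    return (a * b + a + b) % 2

# Agrees with classical OR on every input pair.
assert all(OR_mod2(a, b) == (a | b) for a in (0, 1) for b in (0, 1))
```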

Using XOR as our additive basis, OR is somewhat more complicated to express in arithmetic notation. But remember that OR has a bunch of properties that normal addition doesn't: it's distributive, and it isn't fully invertible. Some extra complexity is to be expected.

Using XOR and AND as our additive and multiplicative basis, we can derive any function or operation you can imagine in boolean algebra. The entirety of boolean algebra is contained within MOD 2 arithmetic. Every theorem about AND, OR, and NOT under boolean algebra remains true as defined here. MOD 2 arithmetic (by analogy, XOR and AND) makes a better basis for boolean algebra: it's functionally identical, has no redundant axioms, and is fully analogous to classical algebra, even the NOT operation being a simple +1.

2. ## Re: Boolean Algebra is Wong

Originally Posted by crayzz
...if we were to reduce our basis to as few axioms as possible.
If that's our goal, we can chop it down to a single operator, NAND:

| A | B | A NAND B |
|---|---|----------|
| 0 | 0 | 1        |
| 0 | 1 | 1        |
| 1 | 0 | 1        |
| 1 | 1 | 0        |

We can show this is complete by showing we can build NOT and OR out of it, as follows:

A' = A NAND A
A OR B = (A NAND A) NAND (B NAND B)
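
These constructions can be checked exhaustively too; a quick Python sketch:

```python
def NAND(a, b):
    return 1 - (a & b)

def NOT(a):
    return NAND(a, a)

def OR(a, b):
    return NAND(NAND(a, a), NAND(b, b))

# The constructions reproduce classical NOT and OR on every input.
for a in (0, 1):
    assert NOT(a) == 1 - a
    for b in (0, 1):
        assert OR(a, b) == (a | b)
```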

Note, however, that minimal axiomization is not the only goal we might reasonably have. One of the big values of Classical Boolean Algebra is how it maps into other spaces. Consider set theory: if we have a set S and all of its subsets, including the empty set E, there's a direct relationship between Union and OR, Intersection and AND, and Complement and NOT. In particular, if we set S=1 and E=0, we get exactly the same behaviors from these operators. As with so many things, the representation (and axiomization) we wish to use depends on our overall objectives.

3. ## Re: Boolean Algebra is Wong

Originally Posted by crayzz
OK wrong is the wrong word, I just don't like how it's described. It's killing me since we're going over boolean algebra and logic gates in class, so I'm going to explain what I think is a better basis to you fine folk. ...

Looking for attention? Shouldn't this be posted on a pro math site?

4. ## Re: Boolean Algebra is Wong

Yes, of course. It's well known that Boolean algebra can be reduced to one operation. I once had a textbook that reduced it all to the operation "if-then" and the rule of modus ponens. That was convenient for theorem proving.

It is not, however, convenient for solving logic problems, so for that purpose, we use and, or, iff, nand, etc.

5. ## Re: Boolean Algebra is Wong

Originally Posted by Jay R
Yes, of course. It's well known that Boolean algebra can be reduced to one operation. I once had a textbook that reduced it all to the operation "if-then" and the rule of modus ponens. That was convenient for theorem proving.

It is not, however, convenient for solving logic problems, so for that purpose, we use and, or, iff, nand, etc.
I'm completely failing to understand the problem as well. Are you claiming that finite fields are a problem? While you are unlikely to encounter them in undergraduate studies, they are pretty important in things like error correction (Reed-Solomon uses them) and public key encryption (RSA uses simple ones; the ones used by elliptic curves are not so simple).

Simply put, a function is a mapping from one set of inputs to another. With normal integer functions you may have an infinite "truth table", but you can still write the table for finite numbers of inputs (and theoretically can with the integers, I'm sure that matters in some proofs). Not all math uses infinite inputs (you mentioned mod, math over a field mod "something (typically prime)" is an early favorite to learn) and some of them are important.

Look up "clock math" for the most basic introduction to finite fields (I'm not sure it really counts as a "field", reed-solomon and ecc were a bit much for me). Then realize that RSA works pretty much on the same principles, only for rather large "clocks".

PS: I'd recommend fixing the title. Considering that binary numbers have been known to China since at least the I-Ching, that typo is pretty inappropriate.

6. ## Re: Boolean Algebra is Wong

Originally Posted by FreddyNoNose
Looking for attention? Shouldn't this be posted on a pro math site?
A forum called "Mad Science and Grumpy Technology" is as good a place as any for this sort of thing. It's unfair to accuse anyone of seeking attention in a post; by participating here, you (and I for that matter) are doing this as well.

7. ## Re: Boolean Algebra is Wrong

Originally Posted by theNater
If that's our goal, we can chop it down to a single operator, NAND:

| A | B | A NAND B |
|---|---|----------|
| 0 | 0 | 1        |
| 0 | 1 | 1        |
| 1 | 0 | 1        |
| 1 | 1 | 0        |

We can show this is complete by showing we can build NOT and OR out of it, as follows:

A' = A NAND A
A OR B = (A NAND A) NAND (B NAND B)

Note, however, that minimal axiomization is not the only goal we might reasonably have. One of the big values of Classical Boolean Algebra is how it maps into other spaces. Consider set theory: if we have a set S and all of its subsets, including the empty set E, there's a direct relationship between Union and OR, Intersection and AND, and Complement and NOT. In particular, if we set S=1 and E=0, we get exactly the same behaviors from these operators. As with so many things, the representation(and axiomization) we wish to use depends on our overall objectives.
That's actually cool, I didn't know that.

You're right, of course, that minimal axiomization isn't necessarily desirable: these frameworks are built to be useful, and their usefulness is the arbiter of their value.

The mapping from boolean algebra to operators on sets is a good point (e.g. DeMorgan's laws are actually laws about set operators, etc). Boolean algebra is practically just standard set operators on the set {{},{1}}, so we automatically get tools from set theory that way. We lose that if we ground our framework in modulus arithmetic.

Originally Posted by wumpus
Look up "clock math" for the most basic introduction to finite fields (I'm not sure it really counts as a "field", reed-solomon and ecc were a bit much for me). Then realize that RSA works pretty much on the same principles, only for rather large "clocks".
Clock math is actually what I'm talking about! It's the same idea (modulus arithmetic).

The fact that it's actually a field (e.g. fully invertible over addition) is the main advantage I feel MOD 2 arithmetic has over boolean algebra, on aesthetic grounds if nothing else (I do actually think inversion is a useful tool for manipulating expressions). It behaves more like the sort of algebra you're probably already used to. Being able to express NOT in terms of addition alone is also pretty useful IMO.

Originally Posted by wumpus
PS: I'd recommend fixing the title. Considering that binary numbers have been known to China since at least the I-Ching, that typo is pretty inappropriate.
I just noticed the typo. Unfortunately, I don't know how to fix it for the thread in general (though I fixed it for my reply here).

Originally Posted by Leewei
A forum called "Mad Science and Grumpy Technology" is as good a place as any for this sort of thing. It's unfair to accuse anyone of seeking attention in a post; by participating here, you (and I for that matter) are doing this as well.
Yeah, I mean: of course I'm looking for attention. I'm looking for engagement on a particular topic. Given the math focused threads in the past, and the general impression I have of how well versed various forum members are in math (not to mention the fact that I just like it here), this seemed like a good place for that sort of thing.

In the end, OR, AND, and NOT probably are better bases for boolean algebra: if nothing else, it's easier to build a physical OR gate (just let all the inputs go through the gate) than it is to build a XOR gate. OR is probably slightly easier to grasp at first, too. It's probably more practical to think of OR as a fundamental operation.

8. ## Re: Boolean Algebra is Wrong

Originally Posted by crayzz
Unfortunately, I don't know how to fix it for the thread in general (though I fixed it for my reply here).
Edit the first post, and change its title.

GW

9. ## Re: Boolean Algebra is Wrong

There's no reason to call NOT, OR and AND (or NOT, XOR and AND) "the three fundamental operations of Boolean algebra". You can just as well say that it has one fundamental operation, or 16. And if you want to reduce it to just one, you have multiple choices, including both NAND and NOR.

For completeness, the list of all 16 binary operators:

NO
AND
LT ("less than")
NA ("not A")
GT ("greater than")
NB ("not B")
XOR
NAND
NOR
NXOR
B
NGT ("not greater than")
A
NLT ("not less than")
OR
YES

I know there are exactly 16, because there are four elements in the truth table, and so there are 16 possible truth tables. "NO" always returns 0, regardless of inputs, and "YES" always returns 1. "A" always returns the value of its first argument, regardless of the second, and "B" always returns the value of the second, regardless of the first.
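
That counting argument can be sketched in a couple of lines of Python:

```python
from itertools import product

# A binary operator is fixed by its outputs on the four input pairs
# (0,0), (0,1), (1,0), (1,1), so there are 2**4 = 16 possible tables.
tables = list(product((0, 1), repeat=4))
assert len(tables) == 16
```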

10. ## Re: Boolean Algebra is Wrong

It's NOR gates all the way down.

11. ## Re: Boolean Algebra is Wrong

Computers often use equality to 0 for false and inequality to 0 for true, which gets + back to being a potential OR operator.

I suspect actual implementations use bitwise binary OR with bitwise binary AND, because why would you not*.
(But provided you avoided it ending up as a multiple of the integer size, I think it would be compliant with some standards, if, say, your architecture only had arithmetic operations as primitives.)

[* For that matter, addition (and in modern circuits multiplication) is constructed from logic gates. And it's rather easy to design a circuit to implement bitwise binary operators from logical binary operators.]

Edit 2:
I think "less than", etc., should also be A=>B, B=>A, A=>B', B=>A' (in some order).

12. ## Re: Boolean Algebra is Wronskian

Originally Posted by wumpus
Simply put, a function is a mapping from one set of inputs to another. With normal integer functions you may have an infinite "truth table", but you can still write the table for finite numbers of inputs (and theoretically can with the integers, I'm sure that matters in some proofs). Not all math uses infinite inputs (you mentioned mod, math over a field mod "something (typically prime)" is an early favorite to learn) and some of them are important.
That is one of the definitions of function (the literature has multiple, depending on where you come from: Lambda Calculus, Logic, Set Theory and so on; they are not equivalent in very particular and delicate cases). The set-theoretical definition, as a relation in which the first coordinate does not repeat, has the problem that it is blind to the co-domain, which means the "same" function can be bijective and not bijective simultaneously depending on the co-domain, for example.

It may look pedantic, but I feel it's an addendum to the conceptual issues the OP is having. Each of those definitions has subtleties which make a difference in very delicate and specific use cases. Which definition you take is a matter of both practicality and, to some degree, personal preference. In CSci, tracking the domain is extremely cumbersome, because constructing functions by slowly expanding the domain is natural for computable functions, and many of the tools you build let you do that naturally. In other areas of math this definition is an atrocity, because it's even worse than thinking a curve is its image: you are identifying functions too loosely as pairs of points, without realizing those points may be the same underlying set but in different metrics or, heck, topologies.

Unlike the definitions for Boolean Algebras, for functions the definition DOES matter.

Originally Posted by wumpus
Look up "clock math" for the most basic introduction to finite fields (I'm not sure it really counts as a "field", reed-solomon and ecc were a bit much for me). Then realize that RSA works pretty much on the same principles, only for rather large "clocks".
I'd recommend looking up Finite Fields and Group and Ring Quotients directly. For anyone doing anything above high school mathematics, clock math can cause severe damage to the intuition behind the finite fields and rings produced by the quotients of Z. Especially since we often add the product, ESPECIALLY IN CRYPTOGRAPHY, where we REALLY want the field structure more than the group structure.

Fraleigh's "A First Course in Abstract Algebra" is a good introduction to Groups, Rings and Fields. And J.S. Milne's "Notes on Group Theory" and "Fields and Galois Theory" are great, if very terse, sources for properly learning Finite Fields (available for free on his page). Learning Cryptography directly is much more painful due to the omission of several intuition-building results, in my experience.

Originally Posted by crayzz
In the end, OR, AND, and NOT probably are better bases for boolean algebra: if nothing else, it's easier to build a physical OR gate (just let all the inputs go through the gate) than it is to build a XOR gate. OR is probably slightly easier to grasp at first, too. It's probably more practical to think of OR as a fundamental operation.
You are making a ... not exactly a mistake but a common problem for a lot of students when they first start delving into serious mathematics.

The definition of an object is not necessarily unique. There are often multiple equivalent definitions; most books will spend a good amount of time proving these equivalences immediately after giving their definition, but not all do. And you can start from another definition, and that's fine. Some definitions of an object are very good at giving you an intuition of WHY we want that object. Other definitions are really useful for calculations. Some definitions are easy to generalize. Some definitions look more natural if you come from a smaller object, and have the property that their restriction matches another definition.

Different authors, teachers and people pick different starting points. Any student should be prepared to assimilate the bulk of the definitions and have them readily available in their mental space at their convenience. The equivalences are often presented as theorems, but in reality they are not new results; they are just different constructions of the same object.

It doesn't matter how you build Boolean algebra, because it's the object itself that you care about. Each definition will have an advantage, but that doesn't mean it's THE definition. In fact, insisting on one is severely hampering yourself, since you often want to use the more "powerful" definition to prove something starting from the object, but use the "weaker" definition to prove that another object is actually your original object. A quick example: with compact sets in finite-dimensional metric vector spaces, the finite-subcover definition is often used when we want to prove something and start with a compact set, while if we want to prove a set is compact we use the definition of totally bounded and complete (or, because the space is metric and finite-dimensional, bounded and closed).

(Obviously because the definitions are equivalent there is no more powerful or weaker in proper logical sense but within your intuition of the object there is often a mental hierarchy of how useful certain constructions are).

For the inquisitive mind, Category Theory puts all of the above on far more stable ground: we simply stop caring about the construction of the object and care just about its properties (to the point that the main problem in CT is proving that an object fulfilling what we ask of it even exists). CT massively generalizes certain things that one sees poking around but can't really tell whether they're a coincidence or a deeper underlying fact, Quotients by Kernels, Direct Sums, and Order Reversion on Taking an Algebraic Structure being the gateway drugs.

The first normally pops up when one defines a group quotient (the subgroup you quotient by has to be normal, and this is equivalent to it being the kernel of a homomorphism), and then just appears like the plague in Functional Analysis (the Banach property is preserved by quotients with closed subspaces, and the kernel is always closed). Direct Sums are a much more natural view of the product topology (inb4 Functional Analyst Pitchforks with the Pointwise Convergence Topology). Order Reversion on taking Algebraic Structures appears in Homotopy (the Fundamental Group reverses order under projection) and Galois Theory (a bigger Field means a smaller Galois Group, so you can build intermediate fields by using intermediate subgroups).

13. ## Re: Boolean Algebra is Wrong

my god i'm halfway through my bachelor's in computer science and that hurt my brain. i guess i'm not cut out for theory. it's mainly low level programming that gets complicated like that, right? my brain is too small lol, not enough memory.

14. ## Re: Boolean Algebra is Wrong

Originally Posted by BannedInSchool
It's NOR gates all the way down.
You can build modern logic exclusively with NOR gates and get a fairly compact and optimal piece of hardware. You can also build it with NAND gates and get a similar result. I think XOR and NXOR can also be used to build all boolean logic, but with current technologies it is incredibly inefficient to do so. We had a colloquium last week about a new type of transistor that can be used to create XOR and majority gates easily and thus, despite being about twice the size of a modern MOSFET, could reduce total device size by a factor of 6. It did this by doing things that currently take 8 transistors and making them with 4 that were in a less complex arrangement.
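
The NOR constructions are easy to sanity-check in software; a quick Python sketch (one standard textbook construction, not how any particular hardware lays it out):

```python
def NOR(a, b):
    return 1 - (a | b)

# NOT, OR, and AND built from NOR alone.
def NOT(a):
    return NOR(a, a)

def OR(a, b):
    return NOT(NOR(a, b))

def AND(a, b):
    return NOR(NOT(a), NOT(b))

# Verify against the classical operators on every input pair.
for a in (0, 1):
    for b in (0, 1):
        assert OR(a, b) == (a | b)
        assert AND(a, b) == (a & b)
```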

At the end of the day, it is important to understand the core principles rather than having a heuristic that is normally right.

15. ## Re: Boolean Algebra is Wronskian

Originally Posted by AsteriskAmp
Fraleigh's "A first course in Abstract Algebra" is a good introduction to Groups, Rings and Fields. And "J.S. Milne's" "Notes on Group Theory" and "Fields and Galois Theory" are a great if very terse source to properly learning Finite Fields (which is available for free in his page). Learning Cryptography directly is much more painful from the omission of several intuition building results in my experience.
Speaking of book recommendations, I like to recommend Pinter's A Book of Abstract Algebra for several reasons. Pinter has a nice conversational style and covers a lot of intuition and practical reasoning, but the book still has more than the minimum rigour to use as an introductory university text.

Because of the style of writing, I find it's also nice to read as a refresher, which is how I first used it. It helped codify and reinforce my core knowledge of algebra before moving on to some graduate texts I wanted to work through.

Pinter is pitched at a low starting level of prerequisites, and teaches everything a bachelor of mathematics is expected to know about algebra. So, it can be recommended to just about anyone who is ready for university. Also, it has as its goal building the tools to cover Galois theory, which the last chapters focus on.

All in all, I find Pinter's A Book of Abstract Algebra to be a wonderful "bridge" text.

Oh, and it's a Dover book, so it's easily available and inexpensive.

16. ## Re: Boolean Algebra is Wrong

Be glad you're not a programmer and dealing with tri-value logic. The meaning of the third value is a horrid amalgamation of other meanings, such as unknown, missing data, or not applicable, and messes up all manner of logic tables and functions.
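That third value is essentially SQL's NULL, and the least-bad formalization of it is Kleene's strong three-valued logic. A minimal Python sketch, with `None` standing in for "unknown" (my choice of representation, not anything standard):

```python
# Kleene's strong three-valued logic: True, False, and unknown (None).
# This is the scheme SQL uses for NULL in WHERE clauses.

def and3(a, b):
    if a is False or b is False:
        return False           # one known False decides AND outright
    if a is None or b is None:
        return None            # otherwise any unknown propagates
    return True

def or3(a, b):
    if a is True or b is True:
        return True            # one known True decides OR outright
    if a is None or b is None:
        return None
    return False

def not3(a):
    return None if a is None else not a

# The part that messes up logic tables: NOT(unknown) is still unknown,
# so "x OR NOT x" is no longer guaranteed to be True.
assert or3(None, not3(None)) is None
```

The truth tables stay manageable; the horror the post describes comes from the third value meaning different things (unknown vs. missing vs. not applicable) while the logic treats them all identically.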

17. ## Re: Boolean Algebra is Wrong

Originally Posted by gomipile
Speaking of book recommendations, I like to recommend Pinter's A Book of Abstract Algebra for several reasons. Pinter has a nice conversational style and covers a lot of intuition and practical reasoning, but the book still has more than the minimum rigour to use as an introductory university text.

Because of the style of writing, I find it's also nice to read as a refresher, which is how I first used it. It helped codify and reinforce my core knowledge of algebra before moving on to some graduate texts I wanted to work through.

Pinter is pitched at a low starting level of prerequisites, and teaches everything a bachelor of mathematics is expected to know about algebra. So, it can be recommended to just about anyone who is ready for university. Also, it has as its goal building the tools to cover Galois theory, which the last chapters focus on.

All in all, I find Pinter's A Book of Abstract Algebra to be a wonderful "bridge" text.

Oh, and it's a Dover book, so it's easily available and inexpensive.
Pinter is a good main book for honors algebra in high school, but not for a proper undergraduate algebra course, IMO. It's REALLY slow in some places, while in others it covers too little and introduces most of the material through exercises; it works far better as a side book to one of the heavier hitters. Though I may have misunderstood the target audience of your post, in which case disregard this paragraph.

Fraleigh is also skirting the line for an undergraduate course, but it's much more defensible. My lecturer used Herstein, but I found Artin far better at building intuition; for a refresher I just use Atiyah.

Originally Posted by warmachine
Be glad you're not a programmer and dealing with tri-value logic. The meaning of the third value is a horrid amalgamation of other meanings, such as unknown, missing data, or not applicable, and messes up all manner of logic tables and functions.
Fuzzy Sets and Fuzzy Logic say hi (what do you mean ALL OF THE STATES IN BETWEEN TRUE AND FALSE?!).

18. ## Re: Boolean Algebra is Wrong

Originally Posted by AsteriskAmp
Pinter is a good main book for honors algebra in high school, but not for a proper undergraduate algebra course, IMO. It's REALLY slow in some places, while in others it covers too little and introduces most of the material through exercises; it works far better as a side book to one of the heavier hitters. Though I may have misunderstood the target audience of your post, in which case disregard this paragraph.
/shrug

A lot of my university textbooks had key material in exercises only, and Pinter has much nicer prose than most of them. If one does all the exercises, Pinter covers all of what we did in undergrad algebra and more.

If I were teaching undergrad Abstract Algebra I now, I'd probably assign Pinter as the main text and have the library put some of the "heavier hitters" on reserve. Yes, I'd have to cover some of the text's exercises as lectures, but as I implied, many of my past professors had to do the same without the benefit of a textbook so clearly written.

19. ## Re: Boolean Algebra is Wrong

Originally Posted by AsteriskAmp
Fuzzy Sets and Fuzzy Logic say hi (what do you mean ALL OF THE STATES IN BETWEEN TRUE AND FALSE?!).
It's...not really an even comparison. Fuzzy sets are used to indicate degree of membership, so while there are infinitely many possible values of membership, all those values exist on the same spectrum. Consequently, they all can be handled by the same, relatively concise procedure.
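The "same spectrum, same procedure" point is concrete: fuzzy membership is just a degree in [0, 1], and the standard (Zadeh) operators combine any two degrees with min/max. A small Python sketch; the `tall` membership function is made up purely for illustration.

```python
# Fuzzy-set membership: degrees in [0.0, 1.0] instead of {0, 1},
# combined with the standard min/max operators.

def tall(height_cm):
    # Hypothetical membership function: ramps from 0 at 160 cm to 1 at 190 cm.
    return min(1.0, max(0.0, (height_cm - 160) / 30))

def fuzzy_and(a, b):
    return min(a, b)   # intersection: the weaker membership wins

def fuzzy_or(a, b):
    return max(a, b)   # union: the stronger membership wins

def fuzzy_not(a):
    return 1.0 - a     # complement

m = tall(175)          # 0.5: halfway up the ramp
# Unlike Boolean logic, "x AND NOT x" need not be 0:
assert fuzzy_and(m, fuzzy_not(m)) == 0.5
```

One uniform rule handles every value on the spectrum, which is exactly the contrast with tri-value logic, where the third value is a grab bag of unrelated meanings.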

20. ## Re: Boolean Algebra is Wrong

Originally Posted by Grinner
It's...not really an even comparison. Fuzzy sets are used to indicate degree of membership, so while there are infinitely many possible values of membership, all those values exist on the same spectrum. Consequently, they all can be handled by the same, relatively concise procedure.
Just like tri-logic, it depends: on how the data is represented, on what you want to do with it, and so on. Tri-logic can get really obtuse when you have codes for what is really in that field (worse when the codes look like normal data but sit outside some boundary the data shouldn't reach), and you end up with massive switch tables and whatnot. But fuzzy logic also depends on the application. You might have to deal with fuzzy membership in multiple sets and work out tie-breaking when values are close to the middle. And if you use division and subtraction, floating point and the edge cases at 0 and 1 can REALLY mess up your day, even if formally the values should never go there.

Not saying it's a one-to-one comparison; it was half tongue-in-cheek. Both problems can make a mess of relatively straightforward code just because you have to handle a ton of cases by hand, on top of how things are represented in each case.
