
View Full Version : This guy wants Clavius to review his site



JayUtah
2003-Jul-29, 12:49 PM
http://conspiracies.bounceme.net/

This guy sent e-mail to Clavius wanting to know what we thought of his site. Well, ladies and gentlemen, what do we think of his site?

kucharek
2003-Jul-29, 01:03 PM
Bounce him. Hard.

First of all, Ken Polsson (http://www.islandnet.com/~kpolsson/comphist/comp1969.htm) is surely happy about the first part. Funnily enough, I first saw Ken's page only a few days ago and immediately recognized it.

On the on-board computer:


THIS IS WHAT IT ACTUALLY HAD REVEALED BY A SOVIET SPY THRU CONFIDENCE
...
Disk: 512k floppy drive

Bruahahaha =D>




NASA DIDN'T HAVE THE TECHNOLOGY TO HAVE LIVE VIDEO AT THE DISTANCE TO THE MOON IN 1969 THEY COULDN'T EVEN STREAM VIDEO ON THE NET!!! NO WORKING SATELLITES TO CONNECT FOR VIDEO STREAMING!!!
Bruahahaha =D>
Bruahahaha =D>

Oh my god, stop it. Bruahahaha =D>

ROTFLMAO

If this guy is serious... yeah, he has a serious problem.

Bruahahaha =D> Bruahahaha =D> Bruahahaha =D> Bruahahaha =D>

R.A.F.
2003-Jul-29, 01:45 PM
Well...the very first thing I noticed was this turkey-butt mis-quoting what is perhaps the MOST FAMOUS quote ever. If he can't even get that right...it certainly calls into question the veracity of his other claims.


OK the best example of why this landing is totally faked is hey Neil you up (profanity) on the moon hundreds of miles from Earth why not a full shot of the lunar lander and the Earth as a background shot. Why you not there!!! And a lunar lander shot
with the Earth in the background was beyond the editing abilities of Hollywood producers at 1969.

The Moon is A LOT CLOSER than it looks, only a few hundred miles!!!
No picture of the lander with the Earth in the background? That's just plain WRONG. Without going to the ALSJ, I recall one picture showing the back side of the LM, looking almost straight up, and guess what...THERE'S THE EARTH.

Now I'm not trying to poke fun at a person whose primary language might not be English...but...my favorite part of the site is when he says, "Why you not there!!!" :)

kucharek
2003-Jul-29, 01:49 PM
No picture of the lander with the Earth in the background? That's just plain WRONG. Without going to the ALSJ, I recall one picture showing the back side of the LM, looking almost straight up, and guess what...THERE'S THE EARTH.

AS11-40-5924 (http://www.hq.nasa.gov/alsj/a11/as11-40-5924.jpg).
As Eagle landed pretty close to the center of the visible moon disk, Earth was pretty close to the zenith, making it hard to spot.

captain swoop
2003-Jul-29, 01:51 PM
needs more salt

Donnie B.
2003-Jul-29, 01:55 PM
All your moon base are belong to us!

R.A.F.
2003-Jul-29, 01:59 PM
No picture of the lander with the Earth in the background? That's just plain WRONG. Without going to the ALSJ, I recall one picture showing the back side of the LM, looking almost straight up, and guess what...THERE'S THE EARTH.

AS11-40-5924 (http://www.hq.nasa.gov/alsj/a11/as11-40-5924.jpg).
As Eagle landed pretty close to the center of the visible moon disk, Earth was pretty close to the zenith, making it hard to spot.

That's the one I was referring to. I knew someone else on the board would have it readily available. Thanks, Kucharek.

aporetic_r
2003-Jul-29, 02:20 PM
Is the "no picture of the lander with the Earth in the background" a new claim? I don't specifically recall seeing that one before. Other than that laughably weak argument, I don't see anything new on the whole site. It looks like he has been reading the big conspiracy sites and simply reprinted the claims he found most impressive. Throw in a dash of CAPS LOCK for legitimacy purposes, and that's it. Jay, when he asked you to review his site, he hadn't bothered to read yours, had he? Just out of curiosity, was he polite and serious in asking, or was it like one of the usual HB first posts on here?

Aporetic
www.polisci.wisc.edu/~rdparrish

JayUtah
2003-Jul-29, 03:14 PM
The URL was sent to webmaster@clavius.org, but it refers to something I said on Usenet (where I give the above address as my contact info), so it's not a given he has read the Clavius site.

The note which accompanied the URL doesn't give any indication of being written by someone speaking English as a second language. It's a laughable run-on sentence, but the author uses English colloquialisms correctly, if not punctuation. I believe the poster may be fairly young and accustomed to using Internettish (non-)grammar.

Humphrey
2003-Jul-29, 04:31 PM
My brain is shrinking. At least he knows how to separate paragraphs and change the font colour.

There are some new things I have never heard of before (mentioned above). But Jay, I do not think you should waste your time making pages devoted to this person.

I think a few paragraphs would be more than he needs.



He might be using you to get hits through your site. If you link to his site and give him a debunking, people will visit it to see what you are talking about. Looking at his site's source you see a lot of JavaScript and cookies being set. So he could be profiting from every visitor.

But that is just a guess, not verifiable.

Kaptain K
2003-Jul-29, 04:32 PM
... what do we think of his site?
A little knowledge is a dangerous thing! :roll: Go get 'em Jay!

But, I think you may be ... ](*,)

JayUtah
2003-Jul-29, 04:39 PM
First of all lets start with actual facts NASA!

Keep this in mind while the author substitutes rumor and innuendo for what has been published by many sources.

Hollywood has much better backlighting techniques now.

The techniques of backlighting (i.e., illuminating a subject from behind in order to create a nimbus or outline) haven't changed since the late 1800s.

1969 Computer technology - " IBM's " THIS IS WHAT WAS AVAILABLE!

The odd citation format puzzled me, so I dug a bit and found

http://www.islandnet.com/~kpolsson/comphist/comp1969.htm

which is undoubtedly the author's source for this section. The reference format is a device particular to the original author.

It doesn't appear to be a publication of IBM, which our author alleges. The original title describes it as a chronology of desktop computers beginning in 1969, which makes it of limited utility when discussing special designs such as guidance systems. Of course digital guidance systems had been around since the late 1950s, but they evolved according to a fairly different family tree than personal consumer computers.

Just because the first landing happened in 1969 doesn't mean that's where you look for the state of the art. (Even if the author had looked at the right art.) You have to look at the entire development process through the 1960s.

INTEL ONLY HAD A 1 KILOBYTE CHIP IN 1969 FOR GOD SAKES 460 kb NASA?

This bit of incoherency is how the author sums up his plagiarized analysis of computer technology. This reflects the standard youngster's failure to understand how computers are made. There were computers before there were standardized integrated circuit chips, and there were IC-based CPUs before there were single-package CPUs.

Any electrical engineer can tell you how to wire up demultiplexers and chip select wires to create a fairly large address space out of nothing but 1-kilobyte chips, but that's fairly irrelevant because the memory in the AGC was not made from integrated circuits, nor is memory size a good indication of the capacity of an embedded system. In short, this author is pretty ignorant about computers. Nevertheless he continues.
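To make the demultiplexer point concrete, here's a toy Python sketch (purely illustrative -- this is not the AGC's wiring, and all names are made up): the high-order address bits act as chip-select lines that pick one small chip out of a bank, so sixteen 1-kilobyte parts together present a single 16-kilobyte address space.

```python
CHIP_SIZE = 1024  # cells in one small memory chip

class SmallChip:
    """A single 1-kilobyte memory chip."""
    def __init__(self):
        self.cells = [0] * CHIP_SIZE

class BankedMemory:
    """Sixteen 1K chips behind a 4-to-16 address decoder.

    The high-order address bits play the role of the chip-select
    lines: they pick which chip answers, while the low-order bits
    address a cell inside it.  Result: a 16 KB address space built
    from nothing but 1 KB parts."""
    def __init__(self, n_chips=16):
        self.chips = [SmallChip() for _ in range(n_chips)]

    def _select(self, addr):
        # "Decoder": high bits -> which chip, low bits -> offset
        return self.chips[addr // CHIP_SIZE], addr % CHIP_SIZE

    def read(self, addr):
        chip, offset = self._select(addr)
        return chip.cells[offset]

    def write(self, addr, value):
        chip, offset = self._select(addr)
        chip.cells[offset] = value
```

The same trick, done with real decoder chips and chip-select wires, is how large memories were routinely assembled from small parts.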

THIS IS WHAT NASA CLAIMS WAS IN APOLLO 11 LUNAR LANDER IMPOSSIBLE AT THE TIME!!!
...
...a project manager of NASA's lunar lander will say later that is was a miracle that the lander ever came in the neighborhood of the moon. But knowing how software engineers are this could be taken with a "barrel" of salt.

Software engineering in the mid-1960s was almost completely unlike software engineering today. Today's software engineering schedules are driven by market forces and are orders of magnitude more demanding than software production 40 years ago. The digital autopilot segment of the software -- comprising but a few instructions -- was developed by a team of dozens over an entire year.

IBM delivered the guidance computer.

The author has confused the Saturn V guidance computer with the Apollo Guidance Computer. The Saturn V guidance computer was built by IBM and installed in the vehicle's Instrument Unit. The AGC was designed by MIT's Instrumentation Laboratory and built by Raytheon.

IBM is not a good benchmark during this period. IBM was a Johnny-come-lately to the IC world. They couldn't figure out how to make them. Their first attempt was a set of microminiaturized discrete components potted together on a very small circuit board, and this was the "integrated circuit" they offered to NASA for the Saturn V. At any rate IBM had never had the slickest or most efficient computers. Their sales were based more on reliability and good salesmanship. That's why the Saturn V computer was a dinosaur compared to the AGC.

All systems were replicated in triple. And the answer that came from the computers would be the "most" correct one. ; the so called "Majority Ruling". With this setup a small mistake can not cause a big accident.

There's no magic to voting logic. While it does tend to mitigate the effect of a component failure in the control system, its best application is in dealing with "ratty" data. Even a perfectly good computer will make a mistake if given bad data. Data coming from external sensors will often contain momentary "noise", which can be mistaken for useful data. Because of the way computers "poll" these sensors, it's very hard for multiple computers to get the same ratty data.
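The "majority ruling" the site describes is just this -- a minimal sketch of triple-modular-redundancy voting, not any actual Apollo code:

```python
def majority_vote(a, b, c):
    """Triple modular redundancy: accept any value at least two of
    the three channels agree on; flag total disagreement."""
    if a == b or a == c:
        return a
    if b == c:
        return b
    return None  # all three disagree -- no majority

# One channel picking up momentary sensor noise is simply outvoted:
# majority_vote(100, 100, 37) returns 100.
```

Note that the voter itself does nothing clever: it can't tell a "correct" answer from a merely popular one, which is why it helps against transient glitches but is no substitute for good data.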

Because in space any accident is fatal.

No. Space is less tolerant of error, but not completely intolerant. You can see what's happening: the author is trying to set us up to dismiss the singularity of the AGC by saying only redundancy can solve the problems of space travel.

THIS IS WHAT IT ACTUALLY HAD REVEALED BY A SOVIET SPY THRU CONFIDENCE

I don't understand why this author has to refer to an anonymous "Soviet spy" when the information he provides about the AGC is essentially correct and has been fairly common knowledge for decades. I suppose every author has to make some claim to "insider" knowledge, but in this case he's only shooting himself in the foot. The notable error is

Disk: 512k floppy drive

Of course there was no disk drive on the AGC, nor would one have been useful. The latency of a floppy disk would be far too great to use it as a RAM storage, and the susceptibility of the floppy disk to magnetic fields would make it unwise to use as a ROM storage. The RAM and ROM were provided by tried and tested magnetic core storage.

The real shot in the foot comes when you recall that IBM invented the floppy disk in 1971. They used it as a means to load microcode into their intelligent peripherals. Ironically the IBM 3084 my department purchased new in 1990 still loaded its microcode from an eight-inch floppy disk.

The author goes on -- rather amusingly -- to put this all in "layman's terms" for us.

A calculator would have had to weight about say 30 pounds without the advent of a micro processor or !@#$ integrated chip even 1 bit memory was only available at the time.

I'm not exactly sure what this word salad is supposed to convey, but apparently it's an argument that a meaningful computer could not be built within the constraints imposed on the AGC.

So the ** about is that the computer Apollo 11 couldn't of had suffiencient computer power to calculate entry onto the moons surface.

Two obvious mistakes here, one logical and one factual.

The logical mistake is that the author has attempted to measure the AGC's apparent capacity against a specific problem. Only he hasn't characterized the problem. It's pure handwaving: the reader is supposed to understand intuitively that the guidance problem is a very heinous problem requiring much computing power. And so the author begs the question.

The factual mistake is that the AGC is not the only computer whose power was brought to bear on the problem of guidance. While the AGC could keep track of the state vector, maintain the spacecraft attitude, and perform all the functions of an embedded controller, the real muscle of the Apollo guidance was the impressive collection of mainframes at Mission Control.

Now the pictures baby! * Courtesy of Nasa.gov un-edited *

Image analysis on low-quality JPEGs is rarely fruitful. NASA provides digital copies of Apollo photographs for reference and education, not for research. (The LPI repository even reminds the reader of this.) Bona fide image analysts would know not to try to pick through JPEG pixels.

I will show you the movie to prove that the flag is waving in the movie at the end you can see it WAVE!! No atmosphere on the moon this is proof of my theory!

Yawn.

not to mention look at the 2 light sources !!! shadows are !@#$ up.

The author doesn't mention in what way he believes the shadows in this film are !@#$ up, and I'm not about to guess. I have a good idea, but I'll wait for him to make it plain.

I want to show you something regarding the boot tracks around the flag the patterns are not consistent with normal walking hobbling or any movement that's natural for a human

Hm, I wouldn't expect "normal walking hobbling" in conditions of diminished gravity. "Natural for a human" would presume normal conditions of atmosphere and gravity. What I notice, especially in the foreground footprints, is a pattern of locomotion consistent with the side-by-side kangaroo hops that the Apollo 11 crew sometimes used.

Ok the best example of why this landing is totally faked is Hey Neil your up [expletive] on the moon hundreds of miles from Earth why not a full shot of the lunar lander and the earth as a background shot.

AS11-40-5923
AS11-40-5924

I don't understand the author here at all. He provides one picture and asks why there isn't some other irrelevant picture? And it turns out there is. You can't get much less coherent than that.

And a lunar lander shot with the Earth in the background was beyond the editing abilities of Hollywood producers at 1969.

Codswallop. Any photographer in the 1960s could have created such an effect in-camera. Composite shots of the lunar landscape, spacecraft, and the Earth in the background appeared in Kubrick's 2001.

NASA DIDN'T HAVE THE TECHNOLOGY TO HAVE LIVE VIDEO AT THE DISTANCE TO THE MOON IN 1969 THEY COULDN'T EVEN STREAM VIDEO ON THE NET!!!

In the abstract, there's no difference between transmitting television between you and your next door neighbor, and transmitting it between the moon and the earth. The question is simply signal strength and bandwidth.

The comparison with streaming video today is not as absurd as it seems. The limitation in streaming video today is slow links (not at NASA's end but at the consumer's end) and limited bandwidth. It is not NASA's fault that high-quality video cannot be crammed through the average dial-up user's connection.

But NASA did have to deal with limitations in bandwidth. The Apollo 11 television signal had to share the Unified S-Band link with the voice and telemetry traffic, over the small one-meter dish on the LM. Thus NASA very ingeniously devised the means to transmit meaningful television data -- albeit not of very high quality -- through such a "narrow pipe". NASA, in this respect, was a pioneer in low-bandwidth video transmission.
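A rough back-of-the-envelope comparison, using the round figures commonly cited for the Apollo 11 slow-scan format versus ordinary broadcast TV (illustrative numbers only, not official NASA specs):

```python
# Round, commonly cited figures -- illustrative only.
broadcast_tv = {"frames_per_sec": 30, "lines_per_frame": 525}
slow_scan    = {"frames_per_sec": 10, "lines_per_frame": 320}

def line_rate(fmt):
    """Scan lines transmitted per second -- a crude bandwidth proxy."""
    return fmt["frames_per_sec"] * fmt["lines_per_frame"]

ratio = line_rate(broadcast_tv) / line_rate(slow_scan)
# Roughly a fifth the line rate of broadcast television: that
# reduction is what let the picture share the narrow S-band link.
```

The quality loss people remember in the Apollo 11 footage is exactly this trade: fewer lines and fewer frames in exchange for fitting through the "narrow pipe".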

NO WORKING SATELLITES TO CONNECT FOR VIDEO STREAMING!!!

One does not need satellites in order to "stream" video from the moon.

Glom
2003-Jul-29, 04:40 PM
At least he's not English. I've had enough with the UK's contribution to conspiracism of late.

ToSeek
2003-Jul-29, 05:07 PM
INTEL ONLY HAD A 1 KILOBYTE CHIP IN 1969 FOR GOD SAKES 460 kb NASA?

This bit of incoherency is how the author sums up his plagiarized analysis of computer technology. This reflects the standard youngster's failure to understand how computers are made. There were computers before there were standardized integrated circuit chips, and there were IC-based CPUs before there were single-package CPUs.


Not to mention that he doesn't know the difference between kilobits and kilobytes, going back and forth between the two at will when there's a factor of eight difference.

Humphrey
2003-Jul-29, 05:11 PM
INTEL ONLY HAD A 1 KILOBYTE CHIP IN 1969 FOR GOD SAKES 460 kb NASA?

This bit of incoherency is how the author sums up his plagiarized analysis of computer technology. This reflects the standard youngster's failure to understand how computers are made. There were computers before there were standardized integrated circuit chips, and there were IC-based CPUs before there were single-package CPUs.


Not to mention that he doesn't know the difference between kilobits and kilobytes, going back and forth between the two at will when there's a factor of eight difference.

Can you describe the difference for us?

And Jay: Could you explain this paragraph a little more? I do not understand what you mean.

Thanks to both. :-)

JayUtah
2003-Jul-29, 05:29 PM
A kilobit is 1,000 (or frequently 1,024) binary bits where a bit (Binary digIT) is the ability to store or transmit a "1" or "0" condition. A kilobyte is 1,024 bytes, where a byte is 8 bits grouped together.

The former measurement (abbreviated Kb) is typically used in communications technology as a measurement of transmission capacity. For example, a single consumer satellite transponder has a capacity of around 30 Mb (30 million bits) per second. The data is typically not grouped into bytes, as is common inside a computer, and so a measurement of "bitrate" is more appropriate.

The latter measurement (abbreviated KB; note capitalization) is typically used in data storage and computer technology where memory space is typically divided into 8-bit bytes that are atomically addressable (i.e., the smallest addressable unit of storage). This is actually fairly meaningless in Apollo technology because the smallest addressable unit of storage in the AGC was the "word" composed of 15 bits.

Many early computers used different word sizes. Word sizes were not standardized to multiples of 8 bits until long after the AGC design had been finalized.
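The factor-of-eight confusion, plus the extra wrinkle of the AGC's 15-bit words, in a few lines of Python (the 2,048-word erasable memory figure is the commonly cited one):

```python
kilobit  = 1024          # bits (comms usage is often 1000)
kilobyte = 1024 * 8      # bits, since one byte is eight bits

# The factor-of-eight difference the site author keeps tripping over:
factor = kilobyte // kilobit   # 8

# The AGC didn't deal in bytes at all: its addressable unit was a
# 15-bit word.  Using the commonly cited 2,048-word erasable
# (read-write) core memory:
erasable_words = 2048
erasable_bits  = erasable_words * 15   # 30,720 bits of "RAM"
```

So quoting AGC memory in "KB" at all is already a unit mismatch, before you even get to the Kb-versus-KB sloppiness.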

Doodler
2003-Jul-29, 05:29 PM
This site read to me like the rantings of a 13-to-17-year-old with an attitude. Not sure what a good thrashing, however well deserved, would accomplish. It might be worth a point-by-point dissection; he may not understand or accept it now, but give him 5 or 10 years and he may mature enough to.

Humphrey
2003-Jul-29, 05:34 PM
Thanks, Jay!!! :-)


One more thing: how about this?



There were computers before there were standardized integrated circuit chips, and there were IC-based CPUs before there were single-package CPUs.



Could you explain this a little more too?


Sorry for the questions. I am just curious.

Thanks.

Ducky
2003-Jul-29, 05:36 PM
INTEL ONLY HAD A 1 KILOBYTE CHIP IN 1969 FOR GOD SAKES 460 kb NASA?

This bit of incoherency is how the author sums up his plagiarized analysis of computer technology. This reflects the standard youngster's failure to understand how computers are made. There were computers before there were standardized integrated circuit chips, and there were IC-based CPUs before there were single-package CPUs.


Not to mention that he doesn't know the difference between kilobits and kilobytes, going back and forth between the two at will when there's a factor of eight difference.

Can you describe the difference for us?

And Jay: Could you explain this paragraph a little more? I do not understand what you mean.

Thanks to both. :-)

I know I'm not Jay, but I'm going to try to dust out the brain and take a shot at this.

In memory, you have the nibble, the bit, and the byte.

1 is a nibble. (binary 1)
11 is a bit. (binary 3)
1111 is a byte. (binary 15)

I'm just using ones because I'm lazy.

With "kilo" being a thousand, a kilo bit would be 1000 bits, while a kilobyte is 1000 bytes. Big difference.

Ducky

JayUtah
2003-Jul-29, 05:53 PM
I'm not sure what paragraph Humphrey wanted me to clarify, so I'll clarify the other one too.

Today we build computers using CPUs that have a tremendous degree of functionality in one "package", the "package" being a single indivisible electronic component. This component contains one or more "integrated circuits" that are designed and built as a single unit.

If you consider a basic logic circuit, say a two-input NOR-gate, you can build one with four transistors or vacuum tubes, a few resistors, and some wire. That circuit, and a few others like it, are the building blocks for a "sequential" logic circuit that can progress from one state to another over time. These in turn are building blocks for complex sequential systems that become computers.

The breakthrough was the ability to manufacture, in one process and as a single physical component, a NOR or NAND gate, or any of the other basic "logic"-level circuits that digital logic designers use at their most comfortable level of abstraction. That's an "integrated" circuit. From there it's a natural step to manufacture -- again in a single process -- a prefabricated version of common combinations of these building blocks to make more and more complicated "basic" circuits.
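NOR's status as a universal building block is easy to demonstrate. A toy Python sketch, with gates as functions instead of silicon:

```python
def NOR(a, b):
    """The gate-level building block: 1 only when both inputs are 0."""
    return 0 if (a or b) else 1

# Everything else can be wired from NORs alone:
def NOT(a):
    return NOR(a, a)

def OR(a, b):
    return NOT(NOR(a, b))

def AND(a, b):
    return NOR(NOT(a), NOT(b))
```

This is why a chip maker who could manufacture nothing but NOR gates was still selling a complete logic family: any combinatorial circuit can be composed from them.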

These days the "basic" circuits provided in integrated circuit packages are obscenely complex. I can wander down the hallway to our manufacturing area and see the guys putting these "basic" circuits together to form some of the most powerful computers in the world.

Before that, some of the functions of these complex circuits had to be performed by additional circuits. Early CPU chips could only work on whole-number data -- not fractions. You either had to write programs to essentially have the CPU do long division, or provide a separate FPU chip that would do the fraction computations in a circuit.

Before that, even the whole-number CPUs didn't come as a single chip. You had a chip that would decode the instructions, a chip to do the arithmetic, a chip to talk to the memory banks, and so forth. These were put together on a circuit board to link them together, and that was the "CPU board".

Before that, the basic logic operations of each of those processor components had to be built as chips. You used NAND and NOR chips (which, BTW, you can still buy at Radio Shack) to build the instruction decoder circuit, and other NAND and NOR chips to build the arithmetic unit -- the thing that knows how to add (and maybe subtract).

Before that, you built the individual logic elements out of transistors and resistors, and before that out of coils and vacuum tubes.

Unfortunately most laymen got interested in computers right around the time when entire CPUs were appearing in one IC package, so that's about when the collective layman's understanding of computers begins. This is what makes it hard for them to understand the AGC design and construction. It used integrated circuits, but at the "combinatorial" or "gate" level (i.e., NOR gates). Although sequential circuits in integrated form appeared during AGC development, they were not widely used in it. The design had been "frozen" at a certain point and limited to NOR circuits.

When I was going to engineering school, we were required to be able to design simple computers (even less capacious and flexible than the AGC) under similar constraints of "available" chip packages. Somewhere in my notes I still have my design for the AL201 specification. Sadly we were required only to design it; we were not given the opportunity to actually build it. Although my design passed, one of these days I'm actually going to go out and buy a bunch of cheap NOR chips at Radio Shack and see if I can actually build the thing.

The point is, the author's understanding of computer design and construction is a layman's understanding, and fairly ignorant of how computers -- especially special-purpose computers like the AGC -- were built before 1970.

JayUtah
2003-Jul-29, 05:57 PM
1 is a nibble. (binary 1)
11 is a bit. (binary 3)
1111 is a byte. (binary 15)


No, sorry. A bit is absolutely indivisible. It's either "on" or "off", charged or discharged, set or reset. A light bulb is a good analogy, it's either on or off. (The first person to mention tri-state logic gets wonked on the head. :-) ) That's a bit.

A "byte" is as I already described. Think of it as eight light bulbs in a row. A "nybble" (note creative spelling) is half of a byte, or four bits. Nobody uses that anymore; it was basically tied to a certain Intel chip that had to move a byte in two trips.

Glom
2003-Jul-29, 06:12 PM
The first person to mention tri-state logic gets wonked on the head. :-)

Err... so how would you do that?

R.A.F.
2003-Jul-29, 06:21 PM
I just visited this WooWoo's forum and...HEY...Wait a minute...How did he know that my Mom works for the CIA flying Black Helicopters out of area 51??? :D :D :D :D


You know, I like that phrase, I think I'll use it for my signature over at Apollohoax. :D

JayUtah
2003-Jul-29, 06:24 PM
Err... so how would you do that?

I didn't say I would do it.

Ducky
2003-Jul-29, 07:03 PM
1 is a nibble. (binary 1)
11 is a bit. (binary 3)
1111 is a byte. (binary 15)


No, sorry. A bit is absolutely indivisible. It's either "on" or "off", charged or discharged, set or reset. A light bulb is a good analogy, it's either on or off. (The first person to mention tri-state logic gets wonked on the head. :-) ) That's a bit.

A "byte" is as I already described. Think of it as eight light bulbs in a row. A "nybble" (note creative spelling) is half of a byte, or four bits. Nobody uses that anymore; it was basically tied to a certain Intel chip that had to move a byte in two trips.

Okay, so I had a moment of brain flatulence. It's been 12 years since I dusted off that section of the mind, and since then I've gone through a windshield. <shrug>.

Humphrey
2003-Jul-29, 07:06 PM
That was it, Jay, thanks a lot. :-)


Let me see if i got your post right:
So the computer consisted of one chip doing a certain function and only that one function, right? Like one chip to add, another to subtract, etc. Is this correct?

The chips will then be strung together with some sort of software that could link the different chips together in the proper order.





So what could the command module computer actually do? What were its limits? Was it only basic telemetry and calculations, or could it do some minor maneuvering too?

jesus_christ_hacker
2003-Jul-29, 07:35 PM
http://conspiracies.bounceme.net/

This guy sent e-mail to Clavius wanting to know what we thought of his site. Well, ladies and gentlemen, what do we think of his site?

:^o Ok so I plagerized a bit for a shameless self promotion for hits I corrected it all and now its ALL MY WORDS!! YES ALL GRAMADICALLY * lol * mispelt * ha ha challenged that it is. I guess if I was a real nerd about this I could sit down at the local library and get my facts from Books. I chose the internet to get my info sorry to say I plagerize. I also steal code and better it. Thats what I do . As well I like to use Frontpage * Faster * and you can see what spelling mistakes I have how my html would look lol or php for god sakes. I appologize to everyone I offended and I am a new man now. I am currently looking at my website and I dont see these pages your talking about do they really exist or ever existed? Thanks for helping my prove my point though ! with this incredible research you have done!. :)
I will post a link to your website and friends website on my front page for refernce links ! woo hooo more shameless promotion even for you!

So basically I think that your site is a conspiracy because you have no evidence that I ever posted such dribble. I am fairly well educated even by Canadian standards but I have a tendancy to get " Excited " and " Emotional " about the way I feel on the Internet about certain things. I have to learn to collect my thoughts and only convey my most important ones. I place little effort on selling anything on my site however I see you dont convey the same view as me. [-X Oh well. Hope your guys shirt sales go up. I leave you with my quote by Jesus_Christ_Hacker " For every action, there is a equal and opposite dis-satisfaction "

Doodler
2003-Jul-29, 08:03 PM
Ok, maybe I'm a little slow off the starting mark, but was there a reason to post all that material on your site in the first place if you knew it would be ripped apart by a serious review?

Humphrey
2003-Jul-29, 08:27 PM
Ok, maybe I'm a little slow off the starting mark, but was there a reason to post all that material on your site in the first place if you knew it would be ripped apart by a serious review?

I am thinking he did it for money. There are several new ad agencies that use hits and hit stats for their own needs. They will pay you for a certain number of hits to your site, like say for every 1000 hits you get $5 or something like that. My brother does this on his site (why I don't go to it often). All they require is that you add code for certain cookies.

This is only a guess, but he has an ungodly number of cookies attached to his site. Just look at the source code.




hmmmm.... he wrote this on his site:
"Additional information: http://www.badastronomy.com ( I recommend taking ESL even if you are in your late 20's and graduated from college B4 visiting this site. They're sticklers for bad grammar and people that type quickly shortening actual words with " Internetism's "

He also deleted all of the information except the photograph stuff. Even then it is still wrong.

R.A.F.
2003-Jul-29, 08:37 PM
I even plagerized in college.

Let me guess...you drank your way through College, didn't you?

Sorry...I don't usually post personal attacks like this...I just get the feeling that JCH will take it as a compliment. :)

JayUtah
2003-Jul-29, 08:39 PM
It's not quite that concrete.

A computer is a complicated electronic circuit whether it's made from integrated circuits or from transistors and wire. You can make the equivalent circuit using each of those technologies, but a Pentium made of transistors you buy at Radio Shack would be the size of a couple of rooms. Integrated circuits let you buy common circuits in convenient packages. Throughout time, what constitutes a "common" circuit has differed.

You can make a simple circuit with a battery, a switch, and a light bulb. You close the switch, the light goes on. You can make a "logic" circuit with two switches, a battery, and a light bulb. If you wire them one way, you turn the light bulb on by closing either one of the switches. If you wire it up a different way you can make it so that you have to close both switches for the light to turn on. It's called a "logic" circuit because it behaves like a logical OR operator (either this switch OR that switch), or a logical AND operator (both this switch AND that switch).

If you connect basic circuits like that together in different ways -- with the outputs (light bulbs) of one circuit becoming the inputs (switches) of another circuit -- you can create a circuit that can make simple decisions. If the "switches" are actually sensors (say, that tell when an elevator arrives at a certain floor) and the "light bulbs" are commands to machines (say, to tell a door to open), then you can control the behavior of a system. For example, you can build a circuit that says, "Open the elevator door when the elevator has arrived at a certain floor and the lift motor has stopped." Stated more formally, the output (open the door) is a function of one input (the elevator has arrived) AND another input (the lift motor has stopped).
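That elevator rule, written as code instead of wire (the names are illustrative):

```python
def open_door(elevator_arrived, motor_stopped):
    """The output is a function of one input AND another input --
    the same behavior the hard-wired circuit produces."""
    return elevator_arrived and motor_stopped
```

The door command goes high only when both sensor inputs are high, exactly like the two-switches-in-series light bulb.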

These systems aren't programmable. That is, they don't have software, and their behavior is determined forever by the way they're wired. To change the behavior of the system, you have to change the wires. Many control systems are built just that way. In fact, a lot of the Apollo spacecraft controls were built that way. My screen wallpaper is a portion of the Block I command module circuit diagrams for the SCS and guidance system. On it are many such simple logic circuits that aren't "programmed".

A computer is a logic circuit so general that it can behave in a very large number of ways. The input values to the circuit determine which of several possible behaviors it selects to do, and even which inputs (commands or instructions) to pay attention to next. That stored set of inputs to the circuit is the "program" or software in a system. The joy of this is that you can change the system's response to various external inputs (like sensors) by changing the stored instructions. You don't have to rewire it.

This means that you can manufacture that circuit as one unit. You can adapt it to different tasks by changing the stored instructions. You don't have to rewire it.
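
A toy illustration of the stored-program idea: the "machine" below never changes, only the stored instructions do. The instruction names are invented for this sketch.

```python
def run(program, inputs):
    # A fixed "circuit" whose behavior is selected by stored instructions.
    acc = 0
    for op, arg in program:
        if op == "LOAD":
            acc = inputs[arg]
        elif op == "ADD":
            acc += inputs[arg]
        elif op == "NEGATE":
            acc = -acc
    return acc

inputs = {"a": 3, "b": 4}
adder   = [("LOAD", "a"), ("ADD", "b")]      # this "program" adds
negator = [("LOAD", "a"), ("NEGATE", None)]  # same machine, new behavior
print(run(adder, inputs))    # 7
print(run(negator, inputs))  # -3
```

No rewiring: swapping one list of instructions for another changed what the machine does.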

Now in the Olden Days (1965) it was a pretty good trick just to get six transistors and a few resistors onto one integrated or "printed" circuit. And that would be enough to do an OR or AND circuit -- you know, the two switches and the light bulb. So if you wanted to build an elevator controller with those chips, you had to connect a bagful of OR and AND chips together in the right way to recreate the circuit you built before out of individual components and wire.

Now there are common ways of connecting those basic circuits together to form other useful (yet not programmable) building blocks. A NOR circuit is a simple logic circuit like the ones above. Don't worry about how to wire it up; the light in this circuit is on until you close one of the switches. Now there's a way to take two NOR circuits and combine them into something called an S-R flip-flop. Don't worry about what that is, or what it does. Just believe that it's an important circuit.
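
Taking that on faith, here is a sketch of two NOR circuits cross-coupled into an S-R flip-flop; the little loop is a stand-in for the electrical feedback settling:

```python
def nor(a, b):
    # The bulb is on until you close one of the switches.
    return not (a or b)

def sr_latch(s, r, q=False, q_bar=True):
    for _ in range(4):      # iterate the feedback until it settles
        q = nor(r, q_bar)   # each NOR's output feeds the other's input
        q_bar = nor(s, q)
    return q

print(sr_latch(s=True, r=False))   # True: "set" stores a 1
print(sr_latch(s=False, r=True))   # False: "reset" stores a 0
```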

So as integrated circuit manufacturers got better at putting more and more components into their chips, they thought, "Wouldn't it be great if we could give people an S-R flip-flop on a single chip? They wouldn't have to spend the time building it and worrying about whether they got it right, and they could get more flip-flops into their designs because it would all be there in one chip."

And so that's what they did. And electronic designers combine flip-flops into common arrangements to do certain tasks like drive digital displays for the Apollo DSKY, and so chip manufacturers -- once they could figure out how to fit all the necessary parts onto the integrated circuit -- started making chips to do that kind of thing. Pretty soon you get integrated logic chips that are so generalized they can start responding to stored inputs like computers. And so they are computers.

The circuits don't change. What changes is the ability of electrical engineers and manufacturing engineers to cram more and more commonly-used circuits together on a single chip.

The folks at MIT knew how to build these complicated circuits. They were still building them out of the constituent parts, though. So if they had designed a computer that needed 300 S-R flip-flops, and each flip-flop needed two NOR circuits, and each NOR circuit needed four transistors and two resistors, someone had to go out and buy all the transistors and resistors and spools of wire and build those NOR circuits, which could then be wired together to form SR flip-flops, which could then be wired together to form whatever the engineer was designing.

So when they discovered that someone had figured out how to make a NOR circuit in a sealed chunk of plastic that was relatively cheap and very small, they were excited. It would take thousands of those NOR circuits to build a circuit complicated enough to be considered "general" and "programmable". But it was still a lot better than having to tediously build all those NOR circuits out of their constituent transistors and resistors. And more importantly, it was less delicate, smaller, and lighter.

The AGC used thousands of NOR circuits attached to a custom circuit board. Those boards, put together, connected the circuits in a way that created a working processor -- that "general" circuit whose behavior could be dictated by stored sets of inputs. Those inputs were stored on a ROM "rope", yet another circuit that would supply the processor with a sequence of those inputs when requested.

Let me walk you through a simplified cycle of the digital autopilot -- one of the many tasks of the AGC. The DAP simply wants to keep the spacecraft pointed in the same direction. Let's say an astronaut pushes off from the side of the spacecraft to float across the cabin, and this causes the spacecraft to begin rotating slightly.

The first indication of this (from the computer's point of view) is a change in the gimbal angles. This is the angle formed by the guidance platform (a gyroscope) with its mounting rings attached to the spacecraft itself. Little sensors on the hinges report displacement, which is converted to a number by circuits outside the computer. The computer sees only the number. Several times a second the computer circuit brings that number into its work area and compares it with a number that's already stored in another part of the computer. That number is the expected gimbal angle.

The comparison is done with a "comparator" circuit. Think of it as a circuit with two rows of switches and a light bulb. The light bulb is off only when each pair of switches is either both on or both off. If any of the switch pairs disagree, the light goes on. You can build a circuit to do just that using NOR circuits as a building block.
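
The two-rows-of-switches comparator, sketched in Python. In hardware each per-bit comparison would itself be wired up from NOR circuits; this just models the behavior:

```python
def comparator(row_a, row_b):
    # The bulb lights only if some pair of switches disagrees.
    return any(a != b for a, b in zip(row_a, row_b))

print(comparator([1, 1, 0], [1, 1, 0]))  # False: every pair matches
print(comparator([1, 1, 0], [1, 0, 0]))  # True: the middle pair disagrees
```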

If the comparator signals a mismatch, then that "light bulb" is the signal to begin paying attention to a particular set of stored instructions. Those instructions tell the computer circuit to give that pair of numbers (the measured gimbal angle and the expected gimbal angle) to another circuit that subtracts one number from the other. Think of this as a circuit with two rows of switches and a row of light bulbs (instead of just one). This circuit is wired so that if you consider the switches to be binary digits representing the numbers, the light bulbs will be the binary digits representing the arithmetic difference between them.

Let's say the measured gimbal angle is 6, or 110 in binary. And the expected gimbal angle is 4, or 100 in binary. A "1" indicates a closed switch and a "0" indicates an open switch. If you set your switches correctly on this circuit, the light bulbs will show you the difference of 2, or 010 in binary. The middle light bulb will be on ("1") and the others will be off ("0").
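
That worked example can be checked in a couple of lines:

```python
measured = 0b110   # gimbal angle: 6
expected = 0b100   # expected angle: 4
difference = measured - expected
print(format(difference, "03b"))  # "010": only the middle bulb is on
```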

Conceptually this represents the amount by which the spacecraft is pointed wrong. You want to correct this. You know that in order to do this you have to turn on a thruster to rotate the spacecraft back into the right orientation, and then the opposite thruster to stop that rotation once you have arrived. The amount by which you're "off" is given by that subtraction you did earlier. So that tells you how much time you have to wait between thruster firings.

Now the thrusters are very powerful. So we only have to "pulse" them -- turn them on and then immediately off again. The RCS controller does this for us. The next stored instruction says to "pulse" one of the thrusters by turning on a computer output line that's connected to the RCS input. The RCS controller has circuits that handle turning on the thruster and then turning it off again quickly.

So having "pulsed" the thruster, we have to wait an appropriate amount of time. One way to do that is to take our subtracted number, 2 (010), or the "attitude error" in guidance talk, and count backwards at a constant rate from that number. To do that, we put the number into yet another circuit that subtracts 1. (If you're clever, you can use the same circuit you used before to subtract the gimbal angle and expected angle.) Then you use your comparator to see if that number is equal to zero. If the comparator's output is "no, they're not equal" then we go back and do that set of instructions again -- we subtract one from the attitude error (which is now 1 since we subtracted 1 from 2 in the previous step). Then it goes back into the comparator to see if it's zero.

This time it is, so we stop repeating that sequence of stored input instructions and move on to the next step, which is to "pulse" the opposite thruster to stop our corrective rotation. We do that by turning on another computer output line that tells the RCS to pulse the thruster.
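
Put together, one simplified cycle looks like this toy sketch. The sensor reading and the pulse_thruster callback are invented stand-ins, not the real AGC interfaces:

```python
def dap_cycle(measured_angle, expected_angle, pulse_thruster):
    error = measured_angle - expected_angle   # the subtractor circuit
    if error == 0:
        return                                # comparator: no mismatch, nothing to do
    pulse_thruster("+")                       # start the corrective rotation
    countdown = abs(error)
    while countdown != 0:                     # wait in proportion to the error
        countdown -= 1                        # the same subtractor, reused
    pulse_thruster("-")                       # opposite thruster stops the rotation

fired = []
dap_cycle(6, 4, fired.append)
print(fired)  # ['+', '-']
```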

I hope this helps. The AGC was in fact a fairly sophisticated computer. It wasn't merely a digital autopilot. In terms of the "richness" of its ability to be affected by stored inputs (programs) it was about as sophisticated as the microprocessors just before the ones used in the Apple II personal computer.

jesus_christ_hacker
2003-Jul-29, 08:46 PM
All your doing is bolstering my theory that the computer on the lunar lander was less than a typical calculator today and couldn't calculate a landing trajectory. Lets face it and the live video signals from the moons surface was technologically impossible at the time. You couldn't send a live feed streaming to earth with the radio signals if you did it with the same signal frequency they used in 1969 even now we could only still frame even with our compression technology. Run on sentence whips saliva off face. grrrrrrrr

Jim
2003-Jul-29, 08:51 PM
Ok the best example of why this landing is totally faked is Hey Neil your up [expletive] on the moon hundreds of miles from Earth why not a full shot of the lunar lander and the earth as a background shot.

I don't understand the author here at all. He provides one picture and asks why there isn't some other, irrelevant picture. And it turns out there is. You can't get much less coherent than that.

I think what he means is, here's arguably the Most Important Single Photo In The History Of the World. So, why didn't they frame it to show the earth in the background?

Of course, if the MISPITHOW did show the earth in the background, the argument would be that such a shot is so difficult to set up that it must be faked.

jesus_christ_hacker
2003-Jul-29, 08:56 PM
You can talk to me **** tard look at the Id check the website I am here now. ******* get off the ****ing grammer I told you I dont give a **** about spelling. Anyway its a example picture its not the be all and end all you bad words deleted.

JayUtah
2003-Jul-29, 08:59 PM
I guess if I was a real nerd about this I could sit down at the local library and get my facts from Books.

You should. You're talking here to a large group of real nerds.

I appologize to everyone I offended and I am a new man now.

The plagiarism isn't important to me, at least in terms of how it might have affected someone else. The danger of plagiarism is that you let other people do the thinking for you. That means you might not necessarily understand what that original author is saying.

Thanks for helping my prove my point though ! with this incredible research you have done!. :)

If your point is that the moon landings were faked, we didn't prove your point. We showed that your points are wrong.

Many of us are experts in computers, especially the kinds of computers used to control spacecraft. That's not something that a lot of people are experts in. And so it's not something that just everyone understands. You can't necessarily compare one type of computer with another.

I am fairly well educated even by Canadian standards ...

That's not apparent at your site.

Even well-educated people don't understand the nuances of guidance computers because that's specialized information. It's not hard to understand; it's just that most people don't care to know about it. But if you're trying to tell people that the AGC couldn't have worked, you have to know more about it than just what the average layman knows. The way you talk about computers leads us to believe you don't know much about them.

Well-educated people have generally heard of inertia, which is why the flag bobs and bounces after the astronauts adjust it. Because the aluminum framework is "springy" this bouncing can last for several seconds after the astronauts let go, like a diving board continuing to bounce after the diver has long since entered the water.

In short, you don't seem to know a lot about the physical world. You don't seem to know a lot about Apollo history. You don't seem to know a lot about the technical details of what you're including in your page. This is fine because many people don't care to know those things. But when you publish writings on the subject, people naturally think you know what you're talking about. This creates the potential to misinform people. And some of us feel very strongly that people shouldn't be misinformed if it can be helped.

jesus_christ_hacker
2003-Jul-29, 09:03 PM
Are you trying to tell me that a calculator can land a spacecraft on the moon j@ckass? Or that you can send live images from the moon in 1969 with 2 cups and a string?

Humphrey
2003-Jul-29, 09:05 PM
Yes Jay that helped a ton. Thank you very much. I actually understand it now.


You really should look into a full time professorship. You are really good at it. Heck if you can make a non-engineer understand the basics just by typing out a post, I am sure you can do wonders in the classroom.

It might not pay nearly as much as you get paid now, but if I guess right, you will enjoy it immensely.

Again, thank you. I had always wondered how they made computers work before software. Now I know. :-)

JayUtah
2003-Jul-29, 09:07 PM
All your doing is bolstering my theory that the computer on the lunar lander was less than a typical calculator today and couldn't calculate a landing trajectory.

Not at all. You seem to believe the AGC was solely responsible for navigating the ship. It wasn't. There was a whole roomful of mainframes at Mission Control that monitored the trajectory and sent up commands to the spacecraft that said things like "turn right a little bit".

You also don't seem to understand what's involved in computing a ballistic trajectory. Forgive the pun, but it's not rocket science. Simply maintaining a state vector is well within the capacity of a pocket calculator, and that's the most important thing about guidance.
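
For instance, carrying a state vector forward is just repeated arithmetic. A rough sketch with simple Euler steps -- all the numbers are illustrative, not Apollo figures:

```python
def step(pos, vel, acc, dt):
    # Advance position and velocity one time step (semi-implicit Euler).
    vel = [v + a * dt for v, a in zip(vel, acc)]
    pos = [p + v * dt for p, v in zip(pos, vel)]
    return pos, vel

pos, vel = [0.0, 0.0, 1000.0], [10.0, 0.0, 0.0]
for _ in range(100):                                   # 10 seconds of fall
    pos, vel = step(pos, vel, [0.0, 0.0, -1.62], 0.1)  # lunar gravity, m/s^2
print(round(pos[2], 1))  # altitude has dropped by roughly 82 m
```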

If you were able to convince us that you knew what kind of computational task it was to navigate to the moon, your opinion that the AGC wasn't up to the task would mean something.

Lets face it and the live video signals from the moons surface was technologically impossible at the time.

I don't have to "face" any such thing. I'm a professional engineer. Anytime, anyplace I can stand up and give you a two-hour lecture about television transmissions in space. You're just standing there and telling me it couldn't be done, and to substantiate your opinion you're making all kinds of irrelevant comparisons.

You couldn't send a live feed streaming to earth with the radio signals if you did it with the same signal frequency they used in 1969

Can you provide a signal-processing proof for this statement? We have a lot of combined experience in S-band transmissions. You should be able to use the standard references to support your point.

even now we could only still frame even with our compression technology.

What makes you think compressing an analog signal is anything like compressing today's digital signals?

Doodler
2003-Jul-29, 09:07 PM
Now there are common ways of connecting those basic circuits together to form other useful (yet not programmable) building blocks. A NOR circuit is a simple logic circuit like the ones above. Don't worry about how to wire it up; the light in this circuit is on until you close one of the switches. Now there's a way to take two NOR circuits and combine them into something called an S-R flip-flop. Don't worry about what that is, or what it does. Just believe that it's an important circuit.



Just so I can get something useful out of this thread, is this the origin of the term "flop" in computing. e.g. The new supercomputer competitions based on the number of teraflops performed?

Added: :o Just read one of JCH's replies, given the pending lock-down of this thread, feel free to private message me a reply.

JayUtah
2003-Jul-29, 09:09 PM
Are you trying to tell me that a calculator can land a spacecraft on the moon [expletive]?

Profanity will get you banned from here. I'm telling you that the AGC was sufficient for the tasks assigned to it. You have a hard time believing that because you don't understand the capacity of the AGC in any meaningful way, and you don't know what exactly was assigned to it.

Or that you can send live images from the moon in 1969 with 2 cups and a string?

No. I'm telling you that if you reduce the frame rate by 2/3, the color information by 2/3, and the field information by 1/2, you have only about 6% of the original color television bandwidth requirement. That will fit easily into an S-band signal.
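
The arithmetic in that claim is easy to check:

```python
frame_rate = 1 / 3   # frame rate reduced by 2/3
color      = 1 / 3   # color information reduced by 2/3
fields     = 1 / 2   # field information reduced by 1/2
fraction = frame_rate * color * fields
print(f"{fraction:.1%}")  # 5.6% -- "about 6%" of the original requirement
```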

R.A.F.
2003-Jul-29, 09:10 PM
JCH...Profanity will get you BANNED. Maybe you should curse some more! :)

SeanF
2003-Jul-29, 09:17 PM
Just so I can get something useful out of this thread, is this the origin of the term "flop" in computing. e.g. The new supercomputer competitions based on the number of teraflops performed?


Hey, I can jump in with something useful here! Flops is an acronym for "Floating point operations per second."

:)

The Bad Astronomer
2003-Jul-29, 09:17 PM
Replying to JCH will be useless; I banned him. Although someday he may figure out the monumental number of mistakes he has made (though I doubt it), he won't be doing it here when he posts like that.

I wonder why someone would put together a page like his; it has almost nothing new, and the vast majority of his claims were debunked by Jay and me over a year ago. Weird.

Doodler
2003-Jul-29, 09:20 PM
Replying to JCH will be useless; I banned him. Although someday he may figure out the monumental number of mistakes he has made (though I doubt it), he won't be doing it here when he posts like that.

I wonder why someone would put together a page like his; it has almost nothing new, and the vast majority of his claims were debunked by Jay and me over a year ago. Weird.

I think Jay nailed it when he brought up trying to peg his hit counter for money. I had heard of that before, but didn't make the connection. Personally, I thought he might just be out for a troll :(

Humphrey
2003-Jul-29, 09:30 PM
I think Jay nailed it when he brought up trying to peg his hit counter for money. I had heard of that before, but didn't make the connection. Personally, I thought he might just be out for a troll :(

Ehhheeemm... (http://www.badastronomy.com/phpBB/viewtopic.php?p=122382#122382)

:D

ToSeek
2003-Jul-29, 09:32 PM
All your doing is bolstering my theory that the computer on the lunar lander was less than a typical calculator today and couldn't calculate a landing trajectory.

I actually wrote a lunar landing game on a Texas Instruments calculator when I was in college. I think I had all of 256 instructions (if that many) to do it with, too. It's not a big deal.
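
That kind of game really is tiny: each "turn" is a line or two of arithmetic. A sketch -- all the numbers are made up for illustration:

```python
def play(burn_per_turn, alt=1000.0, speed=50.0, fuel=150.0, gravity=1.62):
    # One loop iteration per game turn; descent speed is positive-down.
    while alt > 0:
        burn = min(burn_per_turn, fuel)
        fuel -= burn
        speed += gravity - burn   # each unit of burn cancels 1 m/s of fall
        alt -= speed
    return speed                  # touchdown (or crash) speed

print(round(play(2.0)))  # too timid a burn: we hit hard
```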

Doodler
2003-Jul-29, 09:37 PM
I think Jay nailed it when he brought up trying to peg his hit counter for money. I had heard of that before, but didn't make the connection. Personally, I thought he might just be out for a troll :(

Ehhheeemm... (http://www.badastronomy.com/phpBB/viewtopic.php?p=122382#122382)

:D
ACK!! my apologies... sorry, brain still recovering from this website. I think my eyes actually crossed.

Humphrey
2003-Jul-29, 09:40 PM
lol, no prob. I have made the same mistake myself. :-)

Glom
2003-Jul-29, 09:42 PM
A whole lot of really intelligent stuff about computers.

Wow. Great stuff. There's a reason why we call Jay the King of Men. Thanks, Jay.

I've written a few programs on my calculator. One was a conversion program to convert imperial units to metric, mainly those given in the Godwin Anthologies. One was to calculate a vector cross product. One was a fancy cosine rule program designed to measure the non-relativistic straight line distance between stars given their ranges from Sol and angular separation. One calculated the coefficients of the binomial theorem. But my proudest achievement has been a set of programs all designed to calculate the position and velocity vectors of a satellite given the original position and velocity vectors and the time elapsed.
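
The cosine-rule one fits in a few lines; a sketch, with purely illustrative numbers:

```python
import math

def star_distance(r1, r2, separation_deg):
    # Law of cosines: straight-line distance between two stars given their
    # distances from Sol and the angle between them as seen from Sol.
    theta = math.radians(separation_deg)
    return math.sqrt(r1 ** 2 + r2 ** 2 - 2 * r1 * r2 * math.cos(theta))

print(round(star_distance(3.0, 4.0, 90.0), 3))  # 5.0: the 3-4-5 right triangle
```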

Humphrey
2003-Jul-29, 09:47 PM
The only programs i have written on my own and not for class were two "You Don't Know Jack" programs with working multiple choice questions and fill in the blank, and a calculator program to exchange money from one form to another.

I know how to program (basic stuff in:) Basic, a good amount of xhtml and css, and a little pascal.

My website is entirely written by hand except for a little javascript I originally had.



But this is off topic.

-----------------



Was the computer the same for all apollo craft, or was it upgraded with advancing technology?

Glom
2003-Jul-29, 10:09 PM
BTW, forgive my ignorance, but what exactly is a transistor?

The Supreme Canuck
2003-Jul-29, 10:13 PM
It's an electronic switch. If there is current (the switch is closed) the computer sees it as a 1 in binary. If there is no current (the switch is open) the computer sees a 0.

Glom
2003-Jul-29, 10:16 PM
So it's another name for a switch?

JayUtah
2003-Jul-29, 10:16 PM
The "flop" in "flip-flop" is the colloquial meaning. A flip-flop is the basic building-block of sequential logic (logic that has states which change over time, as opposed to combinatoric logic whose states change only as the inputs change). They're called that because they flip and flop between two states. SR flip-flops are useful for "storing" a bit. The S refers to a "set" line and the "R" to a "reset" line. If you want to turn the flip-flop output on, you turn on the S line and "clock" the circuit. If you want it turned off, you turn on the R line and "clock" the circuit. If you "clock" the circuit with neither S nor R turned on, the output is whatever it was before. If you "clock" the circuit with both S and R turned on, the seas boil and the moon turns to blood and the universe as you know it comes to a screaming, cataclysmic end. At least that's what they say. You're not supposed to have them both turned on at once. The results are "undefined".
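
A behavioral sketch of one "clock" of that S-R flip-flop (an exception stands in for the seas boiling):

```python
def clock_sr(q, s, r):
    if s and r:
        raise ValueError("S and R both on: results undefined")
    if s:
        return True    # set
    if r:
        return False   # reset
    return q           # neither: output is whatever it was before

q = False
q = clock_sr(q, s=True, r=False)    # set: now True
q = clock_sr(q, s=False, r=False)   # hold: still True
q = clock_sr(q, s=False, r=True)    # reset: now False
print(q)  # False
```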

The unit of computer speed derives from the acronym FLoating-point Operations Per Second. My company's current best is 7.634 teraflops. Technically it should be FLOPS and a FLOPS is singular, but in colloquial usage we singularize it. There's really no such thing as a "teraflop". The current (unclassified) record is 35.86 teraflops by the Earth Simulator (or as I call it, "SimPlanet").

This is a measurement more suited to scientific computing. It doesn't directly compare to measuring processor clock speeds, or measurements on [insert name of your favorite contrived benchmark here]. The standard benchmark for supercomputers is the LINPACK linear algebra problem.

The Supreme Canuck
2003-Jul-29, 10:24 PM
JayUtah: I can't even begin to understand that!

Glom: That was a very basic definition, but it gives you a general idea of what a transistor is.

Glom
2003-Jul-29, 10:25 PM
Thanks.

The Supreme Canuck
2003-Jul-29, 10:27 PM
Sure. If you want to know more, you can probably Google up a better definition than mine.

JayUtah
2003-Jul-29, 10:28 PM
Ah, the transistor.

Remember my analogy with the switches and the light bulbs? I made some vague, handwaving comment about connecting basic logic circuits so that the outputs (light bulb) of one were connected to the inputs (switch) of another? Someone was supposed to ask, "How the dickens do you connect a light bulb to a switch?"

The answer is with a transistor. The transistor has an "in" hole (where you connect the battery), an "out" hole (where you connect the light bulb), and a "switch" hole, where you apply voltage if you want the stuff waiting at the in-hole to be let through to the out-hole. If you wire the output of a circuit to a transistor, when the output is "on" then the transistor will let electricity through -- which is what a switch does. That way the output of one circuit can be the input of another circuit.

It works very similarly to a relay. In a relay the input energizes an electromagnet which closes a bail that completes another circuit.

Hey, can you build logic circuits with relays? You surely can. The Harvard Mark II was a computer built largely out of relays. In fact, tradition has it that the first computer "bug" was a moth that got caught in one of its relays and prevented its bail from closing. (It's a pretty well documented tradition.) And in the Olden Days elevator controllers (the classic example of embedded control systems) were built out of relays. Those of us with too much free time as kids snuck into the top floors of buildings to watch those relays clacking away.
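
The in-hole / out-hole / switch-hole description above can be sketched like so -- a behavioral model only, nothing about real semiconductor physics:

```python
def transistor(in_volts, switch_on):
    # Passes the input through to the output only when the "switch"
    # terminal is driven.
    return in_volts if switch_on else 0.0

# Chaining: the "light bulb" output of one stage drives the switch of the next.
stage1 = transistor(5.0, switch_on=True)
stage2 = transistor(5.0, switch_on=stage1 > 0.0)
print(stage2)  # 5.0: the first stage's output closed the second's switch
```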

Glom
2003-Jul-29, 10:39 PM
So how do they work?


Wait, I know this one... Magnets.

Is it something to do with semiconductors?

johnwitts
2003-Jul-29, 10:41 PM
But this is off topic.


What was the topic???

Getting back to the AGC...

There's an analogy here with automotive technology and engine management systems. Today's Electronic Control Systems can be programmed, updated (chipped), and sometimes even include control of things besides the engine such as Sat Nav etc. My car has two separate systems. One looks after the ignition while the other looks after the fuel injection. They 'talk' to each other, but they are two separate systems. My previous car had an electronic ignition control circuit, but a 'manual' carburetter. The ignition timing was done mechanically. The car previous to that had points for the ignition, basically a switch and a single wire to the coil.

All four of these examples result in the same thing: an engine which is under the control of the user and can usefully perform a task. The LM systems were a similar mix of mechanical, hardwired and software driven systems working in tandem.

One final point about computers and control. (Apologies to those who've heard this before...) Modern ECVT gearboxes have loads of electronic controls, electronically controlled clutches and sensors wired throughout. The CVT gearbox in my car has 2 switches. One tells you it's in neutral so the starter will engage, and one is for the reverse light. All the other functions of gearchange, clutch, reverse etc are done using mechanical or hydraulic means. The thing works a treat, even after 90,000 miles. I wonder how long the ECVT models last..?

The Supreme Canuck
2003-Jul-29, 10:53 PM
Glom: These might help:

http://electronics.howstuffworks.com/diode.htm

http://electronics.howstuffworks.com/microprocessor.htm

Glom
2003-Jul-29, 11:01 PM
8) Great link. Thanks.

The Supreme Canuck
2003-Jul-29, 11:03 PM
Again, no problem. How Stuff Works is possibly one of the best sites on the web for this kind of stuff!

CincySpaceGeek
2003-Jul-29, 11:48 PM
http://conspiracies.bounceme.net/

This guy sent e-mail to Clavius wanting to know what we thought of his site. Well, ladies and gentlemen, what do we think of his site?

:roll: Putz...just...ugh.......putz

Please tell me my tax dollars didn't educate this guy.

BTW your electronics explanations were great! Where were you when I went to DeVry? :) Now if you can 'splain Karnaugh Maps to me with the same simplicity I'll name my first born after ya! :D

Musashi
2003-Jul-29, 11:49 PM
Please tell me my tax dollars didn't educate this guy.

Nope, he's Canadian. :)

The Supreme Canuck
2003-Jul-29, 11:50 PM
Please tell me my tax dollars didn't educate this guy.


They didn't. My tax dollars did.

*Sigh*

:(

Glom
2003-Jul-29, 11:51 PM
Yeah, rub it in Canuck's face! [-X

The Supreme Canuck
2003-Jul-29, 11:52 PM
You just wait until there's a guy like this from England here, and then BAM! It'll be your turn to be ridiculed! :P :wink:

Musashi
2003-Jul-29, 11:53 PM
It's called the BBC! :D

The Supreme Canuck
2003-Jul-29, 11:53 PM
Zing! :lol:

Glom
2003-Jul-29, 11:55 PM
We've had many, many nutjobs from England over here. Anyone remember Santa? Is it even true that the Nasascam guy is English?

The Supreme Canuck
2003-Jul-29, 11:56 PM
Santa was English? :o

You have my sympathy, friend.

Musashi
2003-Jul-29, 11:57 PM
I missed Santa and all the classic nuts. Were they anything like the more recent jobs (Bradgut, mr arriba)?

The Supreme Canuck
2003-Jul-29, 11:59 PM
Take a look. (http://www.badastronomy.com/phpBB/search.php?search_author=Santa)

R.A.F.
2003-Jul-29, 11:59 PM
Please tell me my tax dollars didn't educate this guy.


They didn't. My tax dollars did.

*Sigh*

:(

If I were you I'd demand a refund!!

Also did anyone happen to notice that he couldn't even spell the word...well, I can't say it here but it begins with a B and is defined as being born to an un-wed mother.

Musashi
2003-Jul-30, 12:03 AM
Oh, I remember some of Santa

The Supreme Canuck
2003-Jul-30, 12:08 AM
He was the first nut I saw. Missed everyone before him.

Musashi
2003-Jul-30, 12:13 AM
Yeah, he showed up right around the time that I did. I remember him now.

Vermonter
2003-Jul-30, 12:24 AM
I've seen the incarnations of Prince...he was pretty hardheaded. Was finally vanquished for good when I caught him using Princz or whatever his new handle was. The BA took care of him shortly after. :)

JayUtah
2003-Jul-30, 12:29 AM
Now if you can 'splain Karnaugh Maps to me with the same simplicity I'll name my first born after ya! :D

Yikes, that's reaching back into the old college fog.

A Karnaugh map is a tool for deriving the Boolean expression that embodies the desired set of outputs given the set of possible inputs. The inputs are arranged on the axes of a grid, and the desired output is the grid cell's entry.

The thing that bakes most people's noodle is the dimensionality. If you have two inputs and one output, you put each input on an axis. But what if you have four inputs? Drawing those hypercubical grids is a real pain in the butt, so that's why you double-up the inputs on the axes. If you have inputs A, B, C, and D you put AB on one axis (with entries 00, 01, 11, and 10) and CD on the other axis with the same entries.

The map is useful only if you're good at recognizing significant patterns in the ones and zeros. Some people are really good at that, and some people aren't. But that's why you mark the inputs in non-numerical order. It groups the 1x and 0x and x1 and x0 entries in adjacent rows or columns so you can draw circles around them and determine that the x can be factored out.

Often it comes down to whether you did well in algebra or geometry. You can reduce a Boolean expression by applying Boolean properties, and that tends to be what algebraists favor. You can reduce it using the Karnaugh map, and that's what geometrists favor. I'm a geometrist, so I prefer the maps. But I haven't worked a Karnaugh map in years. I'm just not in a position where I need to design logic circuits.
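
What a Karnaugh-map grouping proves can be checked by brute force. Circling the adjacent cells for A'B and AB says A can be factored out, leaving just B -- a sketch:

```python
from itertools import product

def original(a, b):
    # A'B + AB: the expression before reduction.
    return (not a and b) or (a and b)

def reduced(b):
    # The factored result: just B.
    return b

# Exhaustively confirm the two agree on every cell of the map.
assert all(original(a, b) == reduced(b)
           for a, b in product([False, True], repeat=2))
print("A'B + AB reduces to B")
```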

freddo
2003-Jul-30, 01:26 AM
I can't see anything when I stare at numbers - so boolean expressions in plain english help me infinitely...

CincySpaceGeek
2003-Jul-30, 02:13 AM
Now if you can 'splain Karnaugh Maps to me with the same simplicity I'll name my first born after ya! :D

Yikes, that's reaching back into the old college fog.

A Karnaugh map is a tool for deriving the Boolean expression that embodies the desired set of outputs given the set of possible inputs. The inputs are arranged on the axes of a grid, and the desired output is the grid cell's entry.

The thing that bakes most people's noodle is the dimensionality. If you have two inputs and one output, you put each input on an axis. But what if you have four inputs? Drawing those hypercubical grids is a real pain in the butt, so that's why you double-up the inputs on the axes. If you have inputs A, B, C, and D you put AB on one axis (with entries 00, 01, 11, and 10) and CD on the other axis with the same entries.

The map is useful only if you're good at recognizing significant patterns in the ones and zeros. Some people are really good at that, and some people aren't. But that's why you mark the inputs in non-numerical order. It groups the 1x and 0x and x1 and x0 entries in adjacent rows or columns so you can draw circles around them and determine that the x can be factored out.

Often it comes down to whether you did well in algebra or geometry. You can reduce a Boolean expression by applying Boolean properties, and that tends to be what algebraists favor. You can reduce it using the Karnaugh map, and that's what geometrists favor. I'm a geometrist, so I prefer the maps. But I haven't worked a Karnaugh map in years. I'm just not in a position where I need to design logic circuits.

=D> See!...that wasn't so bad was it? It's nice to remember the good ol' days once in a while. Actually, when it came to K-maps, if it had more than a 4-bit input I was reduced to a blithering, drooling puddle of goo. [In my best Capt. Kirk cadence] I...JUST...didn't...GET IT! I fell into the category of not recognizing patterns easily, I guess. But then again, the class was taught by a guy who swore 'til Tuesday that digital was just a passing fad, so go fig. #-o

Thanks for the try!!!

Rue
2003-Jul-30, 04:05 AM
Another fine thread in the LC: it starts ugly, but in the end the world is a better place. Fortunately it was not a target of the BA's recent lockarama.

JayUtah, have you ever thought of teaching an online course?

JayUtah
2003-Jul-30, 12:56 PM
JayUtah, have you ever thought of teaching an online course?

I thought that's what I was doing. :-)

R.A.F.
2003-Jul-30, 02:12 PM
JayUtah, have you ever thought of teaching an online course?

I thought that's what I was doing. :-)

I agree, every time I read a JayUtah post I learn something new. (Yesterday's class was computers)

The only problem I have with Jay, and it's actually not a problem at all, is this... let's see... how should I word this...

You go through life thinking that you know it all, having those around you say how intelligent you are, and then you read a JayUtah post and realize that you are not the smarty-pants you perceived yourself to be.

And that's quite alright. Learning is a wonderful adventure, and Jay freely shares his knowledge with all.

So what I'm saying, in a round-about way is, Thanks Jay, your continued presence here makes coming to this board a real treat.

ktesibios
2003-Jul-30, 03:39 PM
The "flop" in "flip-flop" is the colloquial meaning. A flip-flop is the basic building-block of sequential logic (logic that has states which change over time, as opposed to combinational logic whose states change only as the inputs change). They're called that because they flip and flop between two states. SR flip-flops are useful for "storing" a bit. The S refers to a "set" line and the "R" to a "reset" line. If you want to turn the flip-flop output on, you turn on the S line and then "clock" the circuit. If you want it turned off, you turn on the R line and "clock" the circuit. If you "clock" the circuit with neither S nor R turned on, the output is whatever it was before. If you "clock" the circuit with both S and R turned on, the seas boil and the moon turns to blood and the universe as you know it comes to a screaming, cataclysmic end. At least that's what they say. You're not supposed to have them both turned on at once. The results are "undefined".



Umm, Jay- that's more like a description of the way a J-K flip-flop behaves. S-R FFs are asynchronous- a "set" or "reset" input is transferred to the outputs immediately, not when a clock signal changes state.

Of course, many of the clocked F-F's (J-Ks and Ds) of my acquaintance also have asynchronous preset and clear inputs which allow them to behave like S-R FFs, which can come in handy at times...
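The distinction ktesibios draws can be sketched in a few lines of Python (a toy behavioral model, not real hardware; the function names are invented for illustration). The S-R version reacts to its inputs immediately and forbids S=R=1, while the clocked J-K version only changes state when "clocked" and treats J=K=1 as a toggle:

```python
def sr_latch(q, s, r):
    """Asynchronous S-R latch: inputs act immediately; S=R=1 is forbidden."""
    if s and r:
        raise ValueError("S=R=1 is undefined for an S-R latch")
    if s:
        return 1
    if r:
        return 0
    return q  # neither asserted: hold the previous state

def jk_clock(q, j, k):
    """Clocked J-K flip-flop: inputs are sampled on a clock edge,
    and J=K=1 is legal -- it toggles instead of being undefined."""
    if j and k:
        return 1 - q  # toggle
    if j:
        return 1
    if k:
        return 0
    return q

q = 0
q = sr_latch(q, s=1, r=0)  # set   -> 1
q = sr_latch(q, s=0, r=0)  # hold  -> 1
q = jk_clock(q, j=1, k=1)  # toggle -> 0
print(q)  # 0
```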

BTW, I think I've noticed something interesting about the ecology of trolls. You know how some organisms that at first glance just seem "icky" turn out to be filling important and necessary niches, e.g. flies and their role in the recycling of organic material?

Trolls do something similar- they act as bait to get the BABB experts into the classroom. Troll the right kind of lure through the LC forum and before you know it, Jay's in front of the class and we're all getting some gratis learning.

Maybe we should organize a "designated troll of the month" pool to help keep the lessons going. :wink:

Doodler
2003-Jul-30, 04:05 PM
I wondered what happened to Prince. Does anyone know what became of Agorabasta?

Donnie B.
2003-Jul-30, 09:10 PM
Aww, I got all busy and missed the fun in this thread! Well, it's been a good read anyway.


BTW, forgive my ignorance, but what exactly is a transistor?
Glom, I must confess, I was astonished when I read this. It seems incredible to me that someone of your obvious intelligence and level of education would be asking this. Maybe it's because I'm (probably) a good bit older than you -- perhaps you grew up after the era of "transistor radios" and all you've ever heard about is "chips".

Well, since this is my area of expertise, I feel the need to expand a bit on Jay's answer (which was perfectly good, as far as it went).

First, if you know about ICs (aka "chips"), the simple answer is that transistors are the active guts of practically every chip.

But looking at Jay's more complete answer, and placing myself in the shoes of an electronic neophyte, I would have had one big-time question. To wit: why would you need a transistor to switch an electrical signal, when you already have the electrical signal in the first place? That is, if you have to send the transistor a control signal to turn it on, why not just send that same signal to whatever it is the transistor output is connected to?

There are several things a transistor does besides pass a signal along, and they are all generally related to the concept of amplification. What a transistor does is allow you to control a large current (or voltage) using a small current (or voltage). Because of this fact, transistors can convert tiny signals (like the ones produced in a radio antenna) into large ones (like the kilowatts flowing through giant speaker stacks at an outdoor concert). [Clarification: it takes quite a few transistors to amplify that much!]

Also, in logic circuits, transistors allow you to create AND and OR gates simply, because they can take a small "sample" of two signals and combine them as required, then drive out a larger signal as the result -- without destroying or degrading the input signals. This is critical, and it's one reason transistors have been enormously successful while other technologies (such as superconducting junctions) have not.

Another advantage is that it's easy to make a transistor do signal inversion -- that is, turn its output OFF when its input is ON, and vice versa. That's one thing you can't do with passive devices (short of a relay).

Transistors are temperature sensitive, but they do their thing over a fairly broad range of temperatures. Even the least capable devices are rated for operation between 0 and 50 degrees C. No supercooling required! They come in a truly vast range of specifications and capabilities, from the tiny, fast, fragile devices inside a microprocessor to industrial-strength monsters capable of driving your local TV station's broadcast antenna. There are myriad "families" of transistor types. BJT, JFET, and MOSFET are the most common; MOSFETs probably outnumber all other types by millions to one, since they're easy to make into chips.
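The "small current controls a large current" idea can be caricatured in a few lines of Python (a deliberately crude toy model; the beta, saturation, and threshold numbers are illustrative only, not real device parameters):

```python
def npn_collector_current(i_base, beta=100.0, i_sat=0.5):
    """Toy model of an NPN transistor: a small base current controls a
    collector current up to beta times larger, clamped at a saturation
    level set by the supply and load (all values in amps)."""
    return min(beta * i_base, i_sat)

# Amplification: 1 mA at the base controls 100 mA at the collector.
print(npn_collector_current(0.001))  # 0.1

# Inversion (common-emitter stage): driven hard on, the transistor pulls
# its output node LOW; with no base drive, the output sits HIGH.
def inverter(vin, vcc=5.0, v_threshold=0.7):
    return 0.0 if vin > v_threshold else vcc

print(inverter(5.0), inverter(0.0))  # 0.0 5.0
```

The second function is the signal-inversion trick Donnie B. mentions: one transistor plus a resistor gives you a NOT gate for free.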

In short (as I climb off my soapbox), transistors are one of the most fundamental elements of modern technology. I wonder whether this is appreciated enough by those who design our school curricula... :-?

Glom
2003-Jul-30, 09:43 PM
Glom, I must confess, I was astonished when I read this. It seems incredible to me that someone of your obvious intelligence and level of education would be asking this.

:oops: :oops:

That's the kind of thing we'd be taught in a Physics class but we did Edexcel physics, which is probably what David Percy did.

Most of what I learn comes from this board. Transistors had never been brought up before so I didn't know about them.

JayUtah
2003-Jul-30, 10:39 PM
Umm, Jay- that's more like a description of the way a J-K flip-flop behaves. S-R FFs are asynchronous- a "set" or "reset" input is transferred to the outputs immediately, not when a clock signal changes state.

Ah, yes. That would explain the conspicuous absence of a clock input on the SR circuit diagram.

Those of you who think you know it all and then have me explain something you didn't know can take comfort in the fact that JayUtah doesn't know it all either and delights in having these things explained to him.

JayUtah
2003-Jul-30, 10:47 PM
When I was young I asked an electrical engineer to explain a transistor to me. Big mistake. I got the whole lecture about doping and boundary effects blah blah. Finally I got someone to explain to me that it was an electrically-operated switch, and a whole bunch of fog suddenly cleared. That's why I take a multi-tiered approach to trying to explain things. I typically forego rigor initially and just say, "It kind of works like this." I can later adorn that conceptual framework with the details of fine behavior and implementation.

Joe Durnavich
2003-Aug-01, 02:47 AM
Although I am only 43 (....wait I should check that.....yep, only 43) two things in this discussion make me feel old. One is all this talk of these new-fangled "transistors." When I was involved in amateur radio, from age 12 in '72 and for the next several years, all the radio equipment was primarily based on vacuum tubes. I tried to learn about transistors, but they always seemed like an alien technology to me and I didn't see much use for them. I have to admit I still feel pretty much the same way, although I have a better grasp of them after reading this thread.

The second thing that makes me feel old is that that Jesus Christ Hacker fellow doesn't understand that video can be modulated onto a carrier and transmitted as an analog signal. He probably wonders what early beta version of MPEG was used in the 1940s-era television sets.

I do sympathize with his feeling that sending a video signal from the moon to the earth is somewhat unbelievable. It really is an incredible task. What made it work, though, were the incredibly huge parabolic dishes at the receiving stations. Video from satellite transponders with a power output of 5 to 11 watts or so, 22,000 miles above the earth, has routinely been received by 12-foot dishes on the ground. An LM on the moon is roughly 10 times farther away with similar power (18.6 watts), so to scoop up the same amount of signal, you need a dish 10 times larger. NASA's ground stations used 85-foot and 210-foot dishes. The 85-footers could receive video, but with some noise, whereas the 210-footers provided excellent quality video.
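Joe's scaling argument can be checked with a few lines of Python (a back-of-the-envelope sketch that ignores frequency, noise temperature, and bandwidth differences; the function name is invented). Received power falls off with distance squared, and a dish's collecting area grows with diameter squared, so holding the signal level constant means diameter scales linearly with distance:

```python
def required_dish_diameter(ref_diameter_ft, ref_distance_mi, distance_mi):
    """Scale a reference dish diameter to a new distance, holding the
    received signal level constant (inverse-square loss vs. area gain)."""
    return ref_diameter_ft * (distance_mi / ref_distance_mi)

# A 12-ft dish suffices for a ~10 W transponder at ~22,000 miles.
# The Moon is roughly 10x farther out:
d = required_dish_diameter(12, 22_000, 240_000)
print(round(d))  # ~131 ft -- right between NASA's 85-ft and 210-ft antennas
```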

2003-Aug-01, 04:46 AM
I started an E-mail dialogue with the site's poster just for curiosity's sake. First he tried to bulldoze me with his theories. I shot them down. He would bring up one point after another, and I would shoot it down again using the things I have learned here at the BABB (thanks, everyone! =D> ).

When I challenged him to refute my points, he would just switch to another tack that I would shoot down again. He got more and more hostile and shrill as I went along. :-?

Finally he accused me of being either part of the CIA, or not part of the CIA--I couldn't tell which--ending with a demand that I stop E-mailing him.

Typical pattern of conspiracists, I take it?

I feel pretty good about myself--I went toe-to-toe against a conspiracist, kept my temper, refuted his ideas with facts...and he was the one who blinked. Thanks again, BABB! =D>

Darn, that sounded so much like a diet plan advert... :oops:

Humphrey
2003-Aug-01, 06:48 AM
congrats Avatar! :-)

AstroSmurf
2003-Aug-01, 08:07 AM
Although I am only 43 (....wait I should check that.....yep, only 43) two things in this discussion make me feel old. One is all this talk of these new-fangled "transistors." When I was involved in amateur radio, from age 12 in '72 and for the next several years, all the radio equipment was primarily based on vacuum tubes. I tried to learn about transistors, but they always seemed like an alien technology to me and I didn't see much use for them. I have to admit I still feel pretty much the same way, although I have a better grasp of them after reading this thread.
I'm a bit younger (29), but transistors are still somewhat mysterious until you start fiddling with their amplifying capabilities. I had the fortune to have access to some books which explained the concept well, and I did some experimenting using components from a dismantled TV. Once you get the hang of it, it's fairly easy to design simple amplifiers.

Transistors have their limits, though. Sometimes, such as in hi-fi applications or when you work with high-power signals, tubes are still the best choice.

The second thing that makes me feel old is that that Jesus Christ Hacker fellow doesn't understand that video can be modulated onto a carrier and transmitted as an analog signal. He probably wonders what early beta version of MPEG was used in the 1940s-era television sets.
Heh, on that subject, doing MPEG2 compression of a live video feed isn't really feasible without substantial equipment. I did a short-term project for one of the subcontractors of the Swedish broadcast network, and the electronics stuff took up several racks, just to process a single feed. Granted, it's probably possible to miniaturize further than they did - the equipment was designed for flexibility and maintainability, not for size.

To return to the topic of this board, I understand that some of the movies from the missions were originally recorded with a 'chemical' signal. :wink: So for those, bandwidth limitations do not apply at all.

Kaptain K
2003-Aug-01, 11:06 AM
Transistors have their limits, though. Sometimes, such as in hi-fi applications or when you work with high-power signals, tubes are still the best choice.
Huh! :-? I'm not sure what you mean by "hi-fi applications" or "high-power signals". The QSC PL 9.0 is considered to be very accurate, is capable of 4500 watts per (stereo) channel into a 4 ohm load and weighs less than 28 Kg! Match that with any tube amp!

captain swoop
2003-Aug-01, 11:28 AM
Transistors have their limits, though. Sometimes, such as in hi-fi applications or when you work with high-power signals, tubes are still the best choice.
Huh! :-? I'm not sure what you mean by "hi-fi applications" or "high-power signals". The QSC PL 9.0 is considered to be very accurate, is capable of 4500 watts per (stereo) channel into a 4 ohm load and weighs less than 28 Kg! Match that with any tube amp!

The thing with tubes is they add pleasant 'colour' distortions to the sound.

I have a Vox AC30, a Marshall valve head, and a Marshall tranny head. Strictly speaking, the tranny is probably putting out a spot-on sound, but the valve heads sound nicer.

Donnie B.
2003-Aug-01, 08:36 PM
Vacuum tubes (thermionic valves to our British friends) still have their uses. Most of us are probably still looking at one as we read this BB!

They do have very nice characteristics for some applications. When overdriven, audio amps based on tubes produce more "melodious" distortion products (generally even-order harmonics) than do bipolar transistors, which tend to produce harsh, odd-order harmonics when they clip.

However, this characteristic is less a function of the tubes themselves than of the fact that tubes can't drive low-impedance loads (like loudspeakers) directly. You have to use a big honkin' transformer to couple the tubes to the voice coils. A saturated transformer "sounds" nicer than a clipping transistor. You could make a transistor-based amp sound nicer if you transformer-coupled it. But audio power transformers are big, heavy, and costly.

Several people have done power MOSFET designs that incorporate circuits to make them behave more like tube amps. I haven't heard any of these, so I can't say how successful they are. There are still plenty of diehard audiophiles that swear by tubes, but I'm not one of them. Unless you're 'way overdriving your equipment, or somebody blundered in the design of the circuit, even moderately-priced solid state amplifiers are cleaner, quieter, and more accurate than all but the most exotic and expensive tube equipment.

I once read an article about somebody who set up blind listening tests between tube and transistor equipment. The listeners were people who were used to, and swore by, tube equipment. At first, the tube amps won all the tests easily. Then, in a second round of tests, nobody could tell the difference (results were no different than chance). Why the change? The testers had added a little 60 Hz hum and second-harmonic distortion to the transistor amps.

This reminds me of an experience I had with video monitors some years back. I had an old monitor I loved, but my company replaced it with a newer model. I hated it. It seemed harsh and "edgy" to me. But some months later, I found the old monitor in a storeroom and reclaimed it. When I hooked it up, I was shocked -- it was terribly fuzzy and defocused. I immediately went back to the new, crisp, clean monitor and put the old one in the storeroom for good. The point is, the new monitor had been better all along, but because it wasn't what I was used to, I didn't recognize it at first. Only after I got accustomed to seeing things "right" could I make a valid comparison.

Donnie B.
2003-Aug-01, 08:39 PM
When I was young I asked an electrical engineer to explain a transistor to me. Big mistake. I got the whole lecture about doping and boundary effects blah blah. Finally I got someone to explain to me that it was an electrically-operated switch, and a whole bunch of fog suddenly cleared. That's why I take a multi-tiered approach to trying to explain things. I typically forego rigor initially and just say, "It kind of works like this." I can later adorn that conceptual framework with the details of fine behavior and implementation.
Jay, I didn't mean to imply any criticism of your discussion of transistors. I hope you don't mind that I took on the "second tier" description.

Glom
2004-Mar-25, 06:20 PM
We did SR latches in digital electronics labs last week. I now know how to store one bit of information. One bit down, only another four hundred billion to go before I can build a hard drive.

I sussed it out like this:



A  B    Q                      /Q
0  1    1                      0
1  0    0                      1
1  1    remembers previous Q   remembers previous /Q


The remembering of the previous Q was interesting. If we cheated the actual circuit slightly by earthing both inputs at the same time, giving a 0 0 input, the output was 1 1 (both LEDs lit up). When we removed the earths from both inputs at the same time, Q (the green LED) was always 1 and /Q (the red LED) was always 0: when the circuit went to memory, Q remembered it was previously 1 and stayed that way, and /Q remembered it was previously 0 and showed that.

Similarly, when we switched off the power, both the LEDs went out of course, but when we switched it on again, the red LED always lit up at first while the green LED always stayed off (assuming we left the circuit on memory). That's because Q remembered it was 0 prior to the circuit being turned on and stayed that way and /Q remembered /Q was 1 and displayed that. That was cool.
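The lab behavior Glom describes matches a cross-coupled NAND latch with active-low inputs, which can be simulated in a few lines of Python (a sketch only; gate delays are modeled by simply iterating until the outputs settle, and the function names are invented):

```python
def nand(a, b):
    return 0 if (a and b) else 1

def nand_latch(a, b, q=0, nq=1):
    """Cross-coupled NAND latch with active-low inputs A and B,
    iterated to a stable state from a given starting Q, /Q."""
    for _ in range(4):  # let the feedback settle
        q, nq = nand(a, nq), nand(b, q)
    return q, nq

print(nand_latch(0, 1))             # (1, 0)  A low: set
print(nand_latch(1, 0))             # (0, 1)  B low: reset
print(nand_latch(1, 1, q=1, nq=0))  # (1, 0)  both high: remembers Q
print(nand_latch(0, 0))             # (1, 1)  both earthed: both LEDs lit
```

The last line reproduces the "cheat" case from the lab: grounding both active-low inputs forces both outputs high at once.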

Donnie B.
2004-Mar-26, 03:24 AM
Similarly, when we switched off the power, both the LEDs went out of course, but when we switched it on again, the red LED always lit up at first while the green LED always stayed off (assuming we left the circuit on memory). That's because Q remembered it was 0 prior to the circuit being turned on and stayed that way and /Q remembered /Q was 1 and displayed that. That was cool.
Sorry to disillusion you, but unless you were using a pretty exotic circuit, it was not remembering its state through a power cycle. Any SR-type latch will power up in an indeterminate state, but that doesn't mean it's random. One of the devices will always power up a little faster or earlier than the other, or one transistor will have slightly higher gain, and that determines the circuit's turn-on state. Or, if the passive devices (resistors etc.) are not perfectly matched, they can determine the turn-on state.

It takes some very special tricks (such as EEPROM or Flash topologies) for a memory circuit to preserve its state when power is cycled.

Um... on second thought...
Unless, of course, you're leaving the inputs tied high-low (or low-high), which of course will force the circuit into the corresponding state upon power-up.

Wingnut Ninja
2004-Mar-29, 05:18 AM
(The first person to mention tri-state logic gets wonked on the head. :-) )

I realize this is old, and I realize I'm risking a head-wonking, but would this be on, off, or no signal? (i.e., 1, 0 or Z)

I went nuts hunting down all the Z's and C's in my circuits last semester. One flipped buffer and your finely tuned RAM is just a bunch of gibberish.

TrAI
2004-Mar-29, 08:17 AM
I realize this is old, and I realize I'm risking a head-wonking, but would this be on, off, or no signal? (i.e., 1, 0 or Z)

I went nuts hunting down all the Z's and C's in my circuits last semester. One flipped buffer and your finely tuned RAM is just a bunch of gibberish.

Tri-state outputs can have the states 0 and 1 like normal logic components, but can also shut off the output to prevent it from being a load on a bus-type interface where several units have to send... If one unit were trying to pull the bus to the 1 level while the others were presenting the 0 level, it would be as good as a short circuit, after all...

One might also think of the possibility for trinary logic, where one has three states, -1, 0, 1 or 0, 1, 2...

JohnOwens
2004-Mar-30, 04:25 AM
One might also think of the possibility for trinary logic, where one has three states, -1, 0, 1 or 0, 1, 2...
I started in on this (the -1,0,1) about 10-12 years ago. I had to make up my own name for it, so I dubbed it "nullcentric trinary" to distinguish it from the more conventional 0,1,2 trinary. One surprising thing I found about it was that once you got used to it, long division and multiplication were quite simple to do with this system.
By the way, since it's difficult to write with -1,0,1 as your digits, the scheme I came up with used -,0,+ instead, which could admittedly become confusing when adding or subtracting. Anybody know of a different convention for this?

Added: P.S. And yes, it was inspired by the trinary computers in Heinlein's works, although I don't recall those being described as nullcentric (by that name or any other). As far as I knew & know, the nullcentrism was my own innovation, although I'd be rather surprised if no one had played with it at all before me.
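For the curious, conversion to and from the -,0,+ notation is only a few lines of Python (a sketch; the function names are invented for this example). The one trick is that a remainder of 2 becomes a -1 digit with a carry into the next place:

```python
def to_balanced_ternary(n):
    """Encode an integer using digits -, 0, + (for -1, 0, +1)."""
    if n == 0:
        return "0"
    digits = []
    while n:
        n, r = divmod(n, 3)
        if r == 2:       # digit 2 becomes -1 with a carry
            r = -1
            n += 1
        digits.append("-0+"[r + 1])
    return "".join(reversed(digits))

def from_balanced_ternary(s):
    value = 0
    for ch in s:
        value = value * 3 + {"-": -1, "0": 0, "+": 1}[ch]
    return value

print(to_balanced_ternary(8))        # "+0-"  (i.e. 9 - 1)
print(from_balanced_ternary("+0-"))  # 8
```

A nice property of the balanced system: negating a number just swaps the + and - digits, so no separate sign is ever needed.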

Donnie B.
2004-Mar-31, 01:12 AM
Typically, the three states of tri-state binary outputs (not trinary) are denoted 0, 1, and high-impedance (high-Z or just Z). As stated in an earlier post, the third state is "off" or open-circuit.

Some logic devices have more complex I/O (input/output) pins that can be programmed to be either inputs or outputs; these can always be tri-stated (made high-impedance) by making them inputs. One familiar microprocessor family (you probably have one in your keyboard) has quasi-I/O pins: when they are set to output a 1, they are relatively high impedance and can be pulled low by external circuitry, thus becoming an input pin. When set to output a 0, they are relatively low impedance, so they can drive external circuitry (such as LEDs).
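The shared-bus idea behind tri-stating can be sketched in Python (a toy model, not any real bus protocol; the class and function names are invented). Any driver may present 0, 1, or "Z", and the bus is valid only when at most one driver is active:

```python
class TriStateDriver:
    """A driver that puts 0, 1, or 'Z' (high impedance, i.e. effectively
    disconnected) onto a shared bus."""
    def __init__(self):
        self.value = "Z"

def bus_resolve(drivers):
    """At most one driver may be active; everyone else must be at Z."""
    active = [d.value for d in drivers if d.value != "Z"]
    if len(active) > 1:
        raise RuntimeError("bus contention -- two drivers fighting")
    return active[0] if active else "Z"  # floating bus if nobody drives

a, b = TriStateDriver(), TriStateDriver()
a.value = 1                  # a drives the bus; b stays high-Z
print(bus_resolve([a, b]))   # 1
a.value = "Z"
print(bus_resolve([a, b]))   # Z -- the bus is floating
```

Two drivers fighting (one at 1, one at 0) raises the "as good as a short circuit" case mentioned earlier in the thread.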

JohnOwens
2004-Mar-31, 07:13 AM
Typically, the three states of tri-state binary outputs (not trinary) are denoted 0, 1, and high-impedance (high-Z or just Z). As stated in an earlier post, the third state is "off" or open-circuit.
Just to clarify, I'm not sure if you were trying to explain this to me or to expand on another post, but my post was all about mathematical use of trinary, not about electronic logic use. Just a mental exercise I indulged myself in a while back.

Irishman
2004-Mar-31, 09:21 PM
JohnOwens, since there are just three states and two values (or is that one value?), you could just use "N,O,P" as your designators. ;)

Glom
2004-Mar-31, 09:29 PM
Okay, now I'm confused. Could someone explain the difference between tri-state binary and trinary, please?

Wingnut Ninja
2004-Mar-31, 09:43 PM
Okay, now I'm confused. Could someone explain the difference between tri-state binary and trinary, please?

Tri-state logic is the electronic sense of having either a high signal, a low signal, or no signal/high impedance, represented as 1, 0 and Z, respectively. It's used in designing circuits, as has been previously explained.

Trinary is a base-three counting system, much like binary (base 2), decimal (base 10), or hexadecimal (base 16).

Glom
2004-Apr-01, 12:09 PM
But isn't tri-state still counting base three like trinary?

Donnie B.
2004-Apr-01, 11:19 PM
But isn't tri-state still counting base three like trinary?
No, tri-state is just a special case of binary. It's a technical enhancement that allows parts of a circuit to be isolated.

Think of it this way: when it's 0 or 1, it's "on-line" and active. When it's Z, it's off-line or "out of the loop". The Z state is not another digit.

Or to use the old light bulb analogy, when it's 0, it's off. When it's 1, it's on. When it's Z, it's out of the socket.

AstroSmurf
2004-Apr-02, 12:17 PM
Tri-state is more like 'true', 'false' and 'don't ask me, I just work here' :P

Andreas
2004-Apr-04, 01:15 AM
One might also think of the possibility for trinary logic, where one has three states, -1, 0, 1 or 0, 1, 2...
I started in on this (the -1,0,1) about 10-12 years ago. I had to make up my own name for it, so I dubbed it "nullcentric trinary" to distinguish it from the more conventional 0,1,2 trinary. One surprising thing I found about it was that once you got used to it, long division and multiplication were quite simple to do with this system.
Actually, they are usually called "ternary" for base 3 and "balanced ternary" for using -1,0,1. And sure, it's simple to do division and multiplication there; it's trivial in binary, after all. Balanced ternary is pretty much binary with a negative-digit add-on.


By the way, since it's difficult to write with -1,0,1 as your digits, the scheme I came up with used -,0,+ instead, which could admittedly become confusing when adding or subtracting. Anybody know of a different convention for this?
D. E. Knuth uses an overlined 1 where he discusses balanced ternary in The Art of Computer Programming, Volume 2. Other people seem to use similar notations.

An early discussion of balanced ternary appears in the American Mathematical Monthly 57 (1950) and the Russians built an experimental computer based on balanced ternary -- the SETUN -- in 1958.