It's good to hear that my intro page is getting used!
Why was 480i picked as the number of lines? Was it the best they could do at the time? Why did they decide on interlacing rather than progressive scan - again, was this just a technical limitation?
The DVD format was developed in the mid-'90s (actual development probably goes back a fair bit farther than that), and at the time there was no way to display more than 480i in the home environment. Therefore it made perfect sense to develop along those lines.
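For anyone curious where the 480 comes from, a quick sketch of the standard NTSC line arithmetic (figures below are the usual published NTSC values, with the 59.94 Hz field rate rounded to 60):

```python
# Rough NTSC arithmetic behind the 480i figure.
total_lines = 525        # scan lines per NTSC frame
blanking_lines = 45      # lines lost to vertical blanking/sync
active_lines = total_lines - blanking_lines  # visible picture lines

fields_per_second = 60             # nominally 59.94
lines_per_field = active_lines // 2  # interlace: odd field, then even field

print(active_lines)      # 480
print(lines_per_field)   # 240
```

So each 1/60th-of-a-second field paints only 240 of the 480 active lines, alternating odd and even.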
I am surprised that film has not adopted a higher frames-per-second rate. Why didn't film use 30 frames per second like TV did at the time?
The infrastructure for film is getting close to a century old now, and the 24fps speed still works well - moving to 30fps would be a 25% increase in the amount of film used (with costs across the board - particularly film stock, developing, and distribution) and would require the retrofitting or replacement of an insanely high number of cameras. It's just not practical to do so.
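The 25% figure falls straight out of the frame-rate ratio; here's a quick back-of-the-envelope sketch using the standard 16 frames per foot of 35mm film:

```python
# Film stock consumed at 24fps vs. 30fps, for 35mm film (16 frames/foot).
FRAMES_PER_FOOT = 16

def feet_per_minute(fps):
    """Feet of 35mm film run through the camera per minute."""
    return fps * 60 / FRAMES_PER_FOOT

at_24 = feet_per_minute(24)            # 90.0 feet per minute
at_30 = feet_per_minute(30)            # 112.5 feet per minute
increase = (at_30 - at_24) / at_24
print(f"{increase:.0%} more stock")    # 25% more stock
```

And that 25% hits every stage - shooting, developing, printing, and shipping release prints to theaters.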
Does a CRT TV actually use all 480 lines of resolution? I thought I read somewhere that an SD analog TV signal would use fewer than 480, whereas a standard DVD would use more of the 480 lines, with the VHS format somewhere in between?
A CRT television (standard definition) can use all 480 lines, although not all sources offer that many lines. I believe that standard VHS lands around 250 or so, for example, with S-VHS going up to the full 480.
Even though an LCD accepts a 1080i input, does it refresh the screen by painting the odd lines and then the even lines, or is it done all at once?
You actually touched on one of the principal points of the new Secrets 1080p article: a fixed-pixel display device like DLP or LCD is inherently progressive. Plasma's the same way, for that matter, as I understand it. It can't display interlaced source material without de-interlacing it first. All of the pixels (or mirrors, in DLP) get refreshed at once every 1/60th of a second. If an LCD display with 720 lines of resolution gets a 1080i input, it has to both scale and de-interlace the input before it can display it (usually by de-interlacing first and then scaling the resulting 1080p signal down to 720 lines).
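The order of operations above can be sketched as a toy model - a simple "weave" de-interlace (interleaving the two fields) followed by nearest-neighbor vertical scaling. Real displays use far more sophisticated motion-adaptive de-interlacing and filtering; this just shows the pipeline:

```python
# Toy pipeline: 1080i fields -> 1080p frame (weave) -> 720-line panel (scale).
def weave(odd_field, even_field):
    """Interleave two 540-line fields into one 1080-line frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # lines 1, 3, 5, ...
        frame.append(even_line)  # lines 2, 4, 6, ...
    return frame

def scale(frame, target_lines):
    """Nearest-neighbor vertical scaling to the panel's line count."""
    src = len(frame)
    return [frame[i * src // target_lines] for i in range(target_lines)]

# Each "line" is just a label here; a real line is a row of pixels.
odd = [f"line{i}" for i in range(1, 1081, 2)]    # 540 odd lines
even = [f"line{i}" for i in range(2, 1081, 2)]   # 540 even lines

frame_1080p = weave(odd, even)         # de-interlace first...
frame_720p = scale(frame_1080p, 720)   # ...then scale to the panel
print(len(frame_1080p), len(frame_720p))  # 1080 720
```

Note that weave only works cleanly on static material; with motion between the two fields, a real de-interlacer has to interpolate instead, which is exactly where the quality differences between displays show up.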