You have several pieces of misinformation here that I feel are important to correct...
This is generally true of actual 'televisions', but obviously there are plenty of computer monitors out there at 1920x1080 or higher resolution which are readily available in sizes well under 30". Whether a resolution delivers visible clarity is a function of viewing distance AND screen size. It is 100% incorrect to ever claim that it comes down to screen size alone. If I'm a foot from a 20" display, then 1920x1080 is definitely going to deliver more visible detail than 1280x720 - and most people with average eyesight would see the difference too.
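To put a rough number on it, here's a quick back-of-the-envelope sketch. The 20"/12" figures are just the example above, and the ~60 pixels-per-degree threshold is only a commonly cited rule of thumb for 20/20 vision (about one arc-minute per pixel), not a hard limit:

```python
import math

def pixels_per_degree(horizontal_pixels, diagonal_in, distance_in):
    """Approximate angular pixel density for a 16:9 display."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)                 # screen width in inches
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))  # horizontal field of view
    return horizontal_pixels / fov_deg

# 20" monitor viewed from 12 inches: 1080p vs 720p
for pixels in (1920, 1280):
    print(pixels, "wide:", round(pixels_per_degree(pixels, 20, 12), 1), "px/deg")
```

Both come out well under ~60 px/deg at that distance, which is exactly why the extra resolution is plainly visible there, while the same comparison from ten feet away would not be.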
This is not true. I know of at least a couple of projectors (Optoma, I believe) which accept a 1080p input signal but are only 720p projectors. Just because I can understand French does not mean I can also speak it.
Except, of course, that 1080p can be sent over VGA, Component, and DVI.
Actually, the HDMI resolutions used are determined by the display sending its capability data (EDID) back to the source, which lets the source know which resolutions are fully supported. A 1080p TV could support resolutions exceeding 1080p if its processing were designed to do so. Or the display may only accept 720p or 1080i input - which was a huge headache with many displays just a year or so ago.
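Conceptually the handshake works something like the toy sketch below - the mode names and lists are made up purely for illustration, not pulled from any real EDID:

```python
# Toy sketch of the capability exchange: the display advertises the modes it
# accepts, and the source picks its output from that list.
DISPLAY_EDID_MODES = ["720p60", "1080i60"]          # e.g. an older "1080i-only" panel
SOURCE_PREFERENCE  = ["1080p60", "1080i60", "720p60", "480p60"]

def pick_output_mode(source_prefs, display_modes):
    """Return the first mode the source prefers that the display says it accepts."""
    for mode in source_prefs:
        if mode in display_modes:
            return mode
    raise RuntimeError("no common mode")

print(pick_output_mode(SOURCE_PREFERENCE, DISPLAY_EDID_MODES))   # -> 1080i60
```

So a 1080p-capable player hooked to that older panel ends up outputting 1080i, not because of the cable, but because of what the display told it.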
Yes they do, and more experienced people do as well.
Not really. One field (not frame) of a 1080i signal carries either the odd or the even lines of a 1920x1080 frame. There are 60 unique fields per second, alternating between the odd and even lines of a frame. Since the goal is to reconstruct a true 1920x1080 frame, it is critical to have very good processing which treats every moment as a 1080p target, not as 1920x540.
Many displays simply treat 1080i as 1920x540, discarding a lot of important information and failing to properly evaluate the fields to produce smooth video. 1080i is far more difficult to deal with than 1080p.
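A toy illustration of the difference (a tiny array stands in for a 1080-line frame, and note that straight weaving like this is only strictly correct for static content - real deinterlacers have to add motion-adaptive processing on top):

```python
import numpy as np

# A full-resolution frame (8x4 toy grid standing in for 1080x1920)
frame = np.arange(32).reshape(8, 4)

# 1080i transmits two fields: the even-numbered lines, then the odd-numbered lines
top_field    = frame[0::2]   # lines 0, 2, 4, 6
bottom_field = frame[1::2]   # lines 1, 3, 5, 7

# Good processing: weave the two fields back into the original full-resolution frame
woven = np.empty_like(frame)
woven[0::2] = top_field
woven[1::2] = bottom_field
assert np.array_equal(woven, frame)          # all lines of detail recovered

# Lazy processing: treat one field as a half-height picture and just line-double it
line_doubled = np.repeat(top_field, 2, axis=0)
print(np.array_equal(line_doubled, frame))   # False: half the vertical detail is gone
```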
Actually, it depends on the source content as to which looks better. With movies shot at 24fps, the conversion to 1080i/60 preserves the full 1920x1080 resolution across the interlaced fields, and a device which properly deinterlaces can recover 100% of the full 1080p resolution when fed a 1080i source created from 24fps film stock.
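Here's a simplified sketch of that 2:3 pulldown idea (field order and cadence details are glossed over, but it shows why nothing is actually thrown away):

```python
# 2:3 pulldown: 4 film frames (24 fps) fill 10 video fields (60 fields/sec).
film_frames = ["A", "B", "C", "D"]    # four consecutive film frames
cadence = [2, 3, 2, 3]                # how many field slots each frame occupies

fields = []
for frame, count in zip(film_frames, cadence):
    parts = [f"{frame}t", f"{frame}b", f"{frame}t"]   # top field, bottom field, repeat
    fields.extend(parts[:count])

print(fields)
# ['At', 'Ab', 'Bt', 'Bb', 'Bt', 'Ct', 'Cb', 'Dt', 'Db', 'Dt']

# Inverse telecine: both fields of every film frame are still present in the stream,
# so pairing them back up rebuilds each frame at full resolution - fields are only
# repeated, never discarded.
for frame in film_frames:
    assert f"{frame}t" in fields and f"{frame}b" in fields
```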
This is why the 'benefit' of 1080p is often dismissed as a stupid point... and I agree with that assessment.
What matters far more than 1080i vs. 1080p is a 24Hz display output, or a multiple thereof, as this can remove judder from movies.
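The arithmetic behind the judder point, for anyone curious:

```python
# How many display refreshes does each 24 fps film frame get at a given refresh rate?
for hz in (60, 72, 120):
    repeats = hz / 24
    if repeats.is_integer():
        print(f"{hz} Hz: every film frame held for {int(repeats)} refreshes -> even cadence, no judder")
    else:
        print(f"{hz} Hz: frames alternate {int(repeats)}/{int(repeats) + 1} refreshes -> uneven cadence = judder")
```

At 60Hz the frames have to alternate between being shown for 2 and 3 refreshes, which is exactly the uneven pacing people perceive as judder; at 72Hz or 120Hz every frame gets the same number of refreshes and the motion stays even.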