Wow! Thank you all! I'll be doing a lot of reading (and constant checking of wikipedia and the dictionary) in the days to come. Thanks so much.
jiripolivka, thanks for your input. For me, there is no waste of time, energy, or money in seeking understanding. It's like they say: you learn so much more when you fail. Thank you also Manuel and RF_Jim. I've been teaching myself this material in my free time, piecing things together from the Internet, which can leave you wondering where to start. It's such a wonderful surprise to find a community of such helpful people.
@RF_Jim: I have a couple of questions, and some refinements of my earlier ones. From a brief research foray into the Friis transmission equation (I used Wikipedia, I admit it), it seems to describe the power transfer between two idealized antennas. In my area, though, we have quite a few different antennas around, and from what I can see they are all available sources of energy. My question was not about harvesting energy from the cell signal that I hope to amplify, but from the other noise in the line: all those other frequencies that would normally be filtered or tuned out. Still sounds like we're talking about milliwatts, I realize, but if we consider all the energy from all the frequencies that are normally filtered out, it would seem to add up to, well, some energy.
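For what it's worth, here is the Friis equation as I understand it from my reading, as a quick sketch. The transmitter power, antenna gains, frequency, and distance below are made-up example numbers, not measurements from my situation:

```python
import math

def friis_received_power_w(p_t_w, g_t, g_r, freq_hz, dist_m):
    """Ideal free-space received power:
    P_r = P_t * G_t * G_r * (lambda / (4 * pi * d))**2
    """
    wavelength = 299_792_458.0 / freq_hz  # speed of light / frequency
    return p_t_w * g_t * g_r * (wavelength / (4 * math.pi * dist_m)) ** 2

# Assumed example: 1 W transmitter, unity-gain antennas,
# 900 MHz, 1 km away -> received power on the order of a nanowatt.
p_r = friis_received_power_w(1.0, 1.0, 1.0, 900e6, 1000.0)
print(p_r)  # roughly 7e-10 W
```

That spreading loss, growing with the square of the distance, seems to be why even a strong transmitter delivers so little power to a distant antenna.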
But, milliwatts, huh. I have an app on my phone (I downloaded all the geeky antenna apps I could find) that tells me the current received strength of my cell signal. Right now it reads -98 dBm. Looking that up, that's in the sub-picowatt range: 10^(-98/10) mW is about 1.6 x 10^-13 W, nearly ten orders of magnitude less powerful than a milliwatt (10^-13 W vs. 10^-3 W). Probably why I get such terrible, terrible cell reception.

But I don't understand, to return to my example of an otherwise very simple passive signal repeater, why (leaving aside the technical problems that jiripolivka pointed out) a few extra milliwatts boosting the signal wouldn't be a good thing, and possibly a very good thing. If my phone barely works at -98 dBm, it seems like it should work well enough that I don't drop calls with just an extra picowatt or two of signal reaching it. My math and understanding fail me when it comes to calculating the signal loss across a medium-sized room, but could a signal really drop from a milliwatt to a picowatt over a few meters? Remember, I'm not talking about a classic amplified signal repeater, which, I understand, does operate in the watt range.
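To check my own numbers, I put together a quick script for the dBm conversion and the textbook free-space path loss across a room. The 900 MHz frequency and 5 m distance are just guesses for my situation, and this ignores walls and everything else:

```python
import math

def dbm_to_watts(dbm):
    """Convert dBm to watts: P[W] = 1 mW * 10**(dBm / 10)."""
    return 1e-3 * 10 ** (dbm / 10)

def free_space_path_loss_db(freq_hz, dist_m):
    """Free-space path loss in dB: 20 * log10(4 * pi * d / lambda)."""
    wavelength = 299_792_458.0 / freq_hz
    return 20 * math.log10(4 * math.pi * dist_m / wavelength)

print(dbm_to_watts(-98))                    # ~1.6e-13 W, i.e. ~0.16 pW
print(free_space_path_loss_db(900e6, 5.0))  # ~45.5 dB over 5 m at 900 MHz
```

If I've done this right, 5 m of free space only costs about 45 dB at 900 MHz, so a milliwatt would arrive as tens of nanowatts, not picowatts. Please correct me if the real answer involves the walls and everything in between.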
Thanks again for your responses!