I think you may be talking about 'normal' LEDs such as those found in mobile phone keypads & backlights - these LEDs will not get hot enough for you to really notice: probably 1/3 watt TOTAL over an area the size of your palm, buried mm's inside the cover. OK, granted, as you've said, IF you put too much voltage across them they'll get too hot & bust (as will any LED). But you'd have to be holding the LED to feel it burning that hot, and by then it's already in its failure mode.
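As a rough back-of-the-envelope check (every number here is my own assumption - a typical keypad might run ~20 LEDs at ~2 V and ~8 mA each - not a measurement), a quick Python sketch:

    # Total heat from a phone keypad/backlight - all figures assumed.
    num_leds = 20          # assumed LED count behind the keypad
    v_forward = 2.0        # assumed forward voltage per LED (volts)
    i_forward = 0.008      # assumed drive current per LED (amps)

    p_per_led = v_forward * i_forward    # watts dissipated per LED
    p_total = p_per_led * num_leds       # total over the whole keypad

    print(f"Per LED: {p_per_led * 1000:.0f} mW, total: {p_total:.2f} W")
    # -> Per LED: 16 mW, total: 0.32 W (roughly the 1/3 W above,
    #    spread over a palm-sized area)

Which is why you never feel it through the cover.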
Now let's take these new LEDs and consider, for example, the new application of aquarium lighting. Not just one LED would be used (well, it could be, but it would get so damned hot it would de-solder itself, or it would have to be cooled with a radiator system - and the light intensity would be such that it would blind you). Instead there are many, about 50 or so, and these LEDs weigh in at 1.5W to 2.0W a pop (I'm talking the $2000 systems here).
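Totting up the array power is simple arithmetic (figures straight from the numbers above):

    # Total dissipation of the assumed 50-LED aquarium array.
    num_leds = 50
    p_low, p_high = 1.5, 2.0   # watts per LED, as quoted above

    print(f"Array dissipation: {num_leds * p_low:.0f}-{num_leds * p_high:.0f} W")
    # -> Array dissipation: 75-100 W
    # Far too much for one unsinked package, but fine spread over 50.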
Look at this little baby:
http://dmcleish.com/Nichia2W/index.html
See the first picture, on the penny? The little tab on the left is there such that the LED HAS to be soldered down to a large copper area so the device can be heatsunk - i.e. connected to a PCB radiator. These things WILL get hot. However, OVERALL these lights won't get AS hot as an equivalent halide. I think 75W of LED cluster equates to producing 95% of the usable light of a 250W halide, therefore dissipating less than a third of the heat. As I said, consider also that this heat is spread over the greater area of 50 LEDs instead of one great big heat-producing halide.
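To see why that copper tab matters, here's the standard thermal-resistance-chain estimate of junction temperature. The resistance values below are illustrative guesses, NOT figures from the Nichia datasheet:

    # Junction temperature of one 2 W LED soldered to a copper pour.
    # All thermal resistances are assumed for illustration.
    p_led = 2.0            # watts dissipated by the LED
    t_ambient = 25.0       # ambient temperature, deg C
    r_junction_pad = 15.0  # junction-to-tab resistance, deg C/W (assumed)
    r_pad_copper = 5.0     # tab-to-copper-pour resistance, deg C/W (assumed)
    r_copper_air = 20.0    # copper-pour-to-air resistance, deg C/W (assumed)

    t_junction = t_ambient + p_led * (r_junction_pad + r_pad_copper + r_copper_air)
    print(f"Estimated junction temperature: {t_junction:.0f} C")
    # -> 105 C. Leave the tab floating in air and the last term balloons,
    #    and the junction quickly cooks past its rated maximum.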
For the cheaper systems, smaller LEDs are used, but again, more are needed, and you'll end up dissipating the same total power - only spread out even further, over say 200 LEDs...
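The per-device numbers make the point (same assumed 75 W total as above):

    # Heat per device: one halide vs 50 big LEDs vs 200 small ones.
    led_total = 75.0   # assumed total LED power, watts

    for n in (1, 50, 200):
        print(f"{n:>3} LED(s): {led_total / n:.2f} W per device")
    print("Halide: 250 W from a single point source")
    # -> 75 W, 1.50 W, then 0.38 W per device - the same total heat,
    #    but easier to shed passively the more you spread it out.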
Andy