LED bulbs with screw bases are limited by cooling to roughly 10 W or less (some would argue 5 W) when relying on convection and an integral heatsink.
I'd actually put the limit closer to 3 watts if you want to approach the claimed 100,000+ hour life. 5 watts with passive cooling might give you 25,000 hours (not bad, but far short of the potential LEDs offer). 10 watts with no active cooling will cook the LEDs in a matter of days.
As for the original question, a 200 watt incandescent produces roughly 3500 lumens. To obtain that within the aforementioned 3 watt thermal limit, you would need LEDs that convert roughly 80% of their input power into light: input ~14 watts, and you end up with around 11 watts of light (~3500 lumens, depending on the spectrum) and 3 watts of heat. In other words, don't look for it any time soon, if ever. The best white LEDs today are a bit over 40% efficient, and going from 40% to 80% will take far longer than going from 20% to 40% did. The physics gets that much harder as you try to eke out every last bit of efficiency.
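To make the arithmetic explicit, here is a back-of-envelope sketch. The ~315 lm per optical watt figure is my assumption for the luminous efficacy of radiation of warm white light; the 3500 lm target and 3 W heat budget come from the discussion above.

```python
# Back-of-envelope check: what wall-plug efficiency would a screw-base
# "200 W equivalent" LED bulb need under a 3 W passive heat budget?
# Assumed: white light at ~315 lm per watt of optical power.
TARGET_LM = 3500          # output of a 200 W incandescent (approx.)
LM_PER_OPTICAL_W = 315    # luminous efficacy of radiation (assumed)
HEAT_BUDGET_W = 3         # passive-cooling thermal limit from above

optical_w = TARGET_LM / LM_PER_OPTICAL_W   # watts of light needed, ~11 W
input_w = optical_w + HEAT_BUDGET_W        # total electrical input, ~14 W
efficiency = optical_w / input_w           # required wall-plug efficiency

print(f"optical power needed:  {optical_w:.1f} W")
print(f"electrical input:      {input_w:.1f} W")
print(f"required efficiency:   {efficiency:.0%}")
```

The result lands near 80%, which is roughly double what the best white LEDs manage today.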
On the other hand, matching the output of a 200, 300, or even 500 watt incandescent lamp is possible with today's LEDs in a purpose-built fixture. For a 200 watt equivalent, you might need about 25 R5-bin XP-Gs driven at 350 mA and mounted on a decent heat sink. Not exactly inexpensive, but such a setup would last decades, even in a commercial setting, if well designed.
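A quick sizing sketch for that fixture. The ~140 lm per emitter figure for a Cree XP-G (R5 bin) at 350 mA and the ~3.0 V forward voltage are my assumptions, and real output would be derated by optics and junction temperature:

```python
import math

# Rough sizing of a 200 W-equivalent LED fixture.
TARGET_LM = 3500      # lumens to match a 200 W incandescent
LM_PER_LED = 140      # XP-G R5 bin at 350 mA (assumed, pre-derating)
VF = 3.0              # typical forward voltage at 350 mA (assumed)
I_DRIVE = 0.35        # drive current in amps

n_leds = math.ceil(TARGET_LM / LM_PER_LED)   # emitters needed, ~25
power_w = n_leds * VF * I_DRIVE              # electrical power into LEDs

print(f"LEDs needed:       {n_leds}")
print(f"power into LEDs:   {power_w:.1f} W")
```

At roughly 26 W into the LED string for 3500 lm, the fixture draws about an eighth of the incandescent it replaces, which is why the heat sink, not the efficiency, is the hard part of the design.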