You can make a reasonable estimate of how much input electrical power is converted to light. The rest is converted to heat.
The maximum theoretical efficacy depends on the spectrum. A lumen is a unit of perceived brightness, and the human eye is most sensitive to green: a watt of green light looks brighter than a watt of red or blue light. So if they all converted 100% of input power to light, a green LED would look brightest, yellow next, and red and blue dimmest, all else being equal. The exact relationships depend on the exact wavelengths chosen. At the far ends of human vision, where the spectrum transitions to ultraviolet or infrared, a watt of light yields zero lumens.
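To make this concrete, here's a small sketch converting one radiant watt of monochromatic light to lumens using rounded values of the CIE photopic sensitivity curve V(λ) (the specific wavelengths and V values below are illustrative lookups, not a full table):

```python
# Approximate photopic sensitivity V(lambda), rounded from the CIE 1931
# luminosity function. 555 nm is the peak of human daytime sensitivity.
V = {
    450: 0.038,  # blue
    555: 1.000,  # green (peak)
    590: 0.757,  # yellow-orange
    650: 0.107,  # red
}

def lumens_per_radiant_watt(wavelength_nm: int) -> float:
    """Luminous efficacy of one watt of monochromatic light at this wavelength."""
    return 683.0 * V[wavelength_nm]

for wl in sorted(V):
    print(f"{wl} nm: {lumens_per_radiant_watt(wl):.0f} lm per radiant watt")
```

A perfect green emitter at 555 nm gives the full 683 lm/W; the same radiant watt of deep red or blue gives only a small fraction of that, which is the whole reason efficacy depends on spectrum.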
Since "white" light is a mix of colors, you'd expect white to fall somewhere between the maximum (683 lm/W at 555 nm) and the minimum of zero. Most white LEDs have a spectrum that translates to 300-340 lm/W; spectra heavy in greens and yellows sit at the high end, and spectra heavy in red and/or blue at the low end. Note that this is entirely a property of the LED's spectrum, not its quality, how it's pumped, how hard it's driven, etc. (except insofar as those things affect the spectrum).
So if you have a neutral white LED with decent CRI, it's reasonable to assume a spectrum worth around 320 lm/W. If this LED is only delivering 100 lm/W, then it is converting
100 lm/W ÷ 320 lm/W ≈ 31% of the input power to light. The remaining 69% is heat.
To me this is easier to grasp the other way around: if your LED converts 31% of 1 watt of input power into light, that's 0.31 W of light output. Multiply that by 320 lm/W and you get about 100 lm!
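The calculation above is simple enough to capture in a few lines; this sketch just restates the 100 lm/W measured vs. 320 lm/W spectral-maximum example from the text:

```python
def wall_plug_efficiency(measured_lm_per_w: float, spectral_lm_per_w: float) -> float:
    """Fraction of electrical input converted to light, given the LED's
    measured efficacy and the theoretical efficacy of its spectrum."""
    return measured_lm_per_w / spectral_lm_per_w

eff = wall_plug_efficiency(100.0, 320.0)  # 0.3125, i.e. ~31% light
heat_fraction = 1.0 - eff                 # ~69% of input power is heat

# The reverse check from the text: 31% of 1 W is 0.3125 W of light,
# and 0.3125 W * 320 lm/W recovers the measured 100 lm.
light_watts = eff * 1.0
lumens = light_watts * 320.0
```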
Keep in mind that other parts of your system will absorb some of the light, converting it to heat and reducing overall system efficacy. It's not hard for lenses, reflectors, filters, etc. to absorb 20% or more of your emitter's output. You'll have to judge whether that affects your heatsink calculations.
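As a rough heat-budget sketch (the 10 W input, 31% efficiency, and 20% optical loss below are assumed example numbers, not measurements):

```python
# Assumed example: 10 W electrical input, 31% wall-plug efficiency,
# optics that absorb 20% of the emitted light.
input_w = 10.0
emitter_light_w = input_w * 0.31            # 3.1 W of light leaves the die
emitter_heat_w = input_w - emitter_light_w  # 6.9 W of heat at the LED itself

optical_loss = 0.20
exit_light_w = emitter_light_w * (1.0 - optical_loss)  # 2.48 W leaves the fixture
optics_heat_w = emitter_light_w * optical_loss         # 0.62 W absorbed by optics

# Whether that 0.62 W loads your LED heatsink depends on whether the
# lens/reflector is thermally coupled to it or sheds heat on its own.
```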