zitterbewegung 2 days ago [-]
This is something I've always thought: you could make an optical computer using MZIs or other technologies that don't have very "exact" requirements on computation. Similar to how LLMs are run today on consumer devices like a MacBook Pro, where we quantize to 4-bit computations, you could hypothetically run a larger model using MZIs to do inference on those systems.
Since you only change the underlying model every so often, instead of running a large training loop, once you set up an optical computer for inference it scales as 2n+1, with clock speeds of up to 100 THz at only 100 W of power, vs. traditional GPUs at 2 GHz with 1 kW for 15k cores.
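A minimal sketch of the idea in this comment (all names and numbers here are illustrative, not from any real device): quantize weights to 4 bits, then compute a dot product where every multiply carries a little analog noise, the way an imprecise MZI-style optical unit would. The point is that the result stays close to the exact answer, since 4-bit quantization already dominates the error budget.

```python
# Hypothetical sketch: low-precision analog compute can be "good enough"
# for inference. Weights are quantized to 4-bit integers and each
# product picks up Gaussian analog noise, mimicking an imprecise
# optical matrix-vector unit. Pure stdlib; sigma is a made-up noise level.
import random

def quantize_4bit(ws):
    """Symmetric 4-bit quantization: map floats to integers in [-7, 7]
    plus a shared scale factor."""
    scale = max(abs(w) for w in ws) / 7.0
    return [round(w / scale) for w in ws], scale

def noisy_dot(q_ws, xs, scale, sigma=0.01):
    """Dot product where each product carries multiplicative analog
    noise with standard deviation sigma."""
    return sum((q * scale) * x * (1.0 + random.gauss(0.0, sigma))
               for q, x in zip(q_ws, xs))

random.seed(0)
ws = [0.8, -0.3, 0.05, 0.6]   # toy "weights"
xs = [1.0, 2.0, -1.0, 0.5]    # toy "activations"
exact = sum(w * x for w, x in zip(ws, xs))
q_ws, scale = quantize_4bit(ws)
approx = noisy_dot(q_ws, xs, scale)
rel_err = abs(approx - exact) / abs(exact)
```

With a small sigma, the quantization error (not the analog noise) dominates, which is the intuition behind running 4-bit models on imprecise hardware.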
LeroyRaz 2 days ago [-]
This reads like something AI generated...
refulgentis 2 days ago [-]
"That’s not a lab toy. That’s a product-trajectory data point."
sigh. (why? because now I have to guess how much is vague handwaving, or an AI trying to fit a square peg into a round hole, and how much is reality)
refulgentis 2 days ago [-]
I get the "AI uses < 32 bit weights sometimes" thing, intimately, but I feel like I'm missing:
A) Why that means calculations can be imprecise - the weights are data stored in RAM, is the idea we'd use > N-bit weights and say it's effectively N-bit due to imprecision, so we're good? Because that'd cancel out the advantage of using < N-bit weights. (which, of course, is fine if B) has a strong answer)
B) A) aside, why is photonics preferable?
aeonfox 1 day ago [-]
A) Wasn't the article suggesting that it would be 4 bits end-to-end in this hypothetical photonic matrix multiplication co-processor? i.e. the weights are 4-bit.
B) Power consumption and speed. Essentially chips are limited by the high resistance (hence heat loss) of the semiconductor. Photonics can encode multidimensionally, and data processing is as fast as the input light signal can be modulated and the output light signal can be interpreted. I guess this would favour heavy computations that require small inputs and outputs, because eventually you're bottlenecked by conventional chips.
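The 2x2 element behind the "MZI" mentioned upthread can be sketched as a small transfer matrix. This is one common textbook convention (two 50:50 beam splitters plus two phase shifters), not any particular vendor's device; since it is a product of unitaries, the element is lossless by construction, and tuning the internal phase sets how much power crosses between the two arms — which is how an MZI mesh encodes matrix entries.

```python
# Hypothetical sketch of a Mach-Zehnder interferometer (MZI) as a 2x2
# linear-optics element: beam splitter, internal phase theta, beam
# splitter, external phase phi. Conventions vary across the literature.
import cmath
import math

def matmul2(a, b):
    """Multiply two 2x2 complex matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mzi(theta, phi):
    """Transfer matrix of one MZI built from two 50:50 beam splitters
    and two phase shifters (theta internal, phi external)."""
    bs = [[1 / math.sqrt(2), 1j / math.sqrt(2)],
          [1j / math.sqrt(2), 1 / math.sqrt(2)]]
    p_theta = [[cmath.exp(1j * theta), 0], [0, 1]]
    p_phi = [[cmath.exp(1j * phi), 0], [0, 1]]
    return matmul2(matmul2(matmul2(bs, p_theta), bs), p_phi)

U = mzi(0.7, 1.3)  # arbitrary example phases
```

Meshes of these 2x2 blocks (e.g. in triangular or rectangular layouts) can realize arbitrary unitary matrices, which is the proposed route to optical matrix multiplication.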
adrian_b 1 day ago [-]
While power consumption and speed are indeed the advantages, depending on the concrete application it may be very difficult or even impossible to make an optical computing device with a size comparable with an electronic device.
The intrinsic size of the optical computing elements is much larger, being limited by wavelength. Then a lot of additional devices are needed, for conversion between electrical and optical signals and for thermal management.
Optical computing elements can be advantageous only in the applications where electronic devices need many metallic interconnections that occupy a lot of space, while in the optical devices all those signals can pass through a layer of free space, without interfering with each other when they cross.
This kind of structure may appear when doing tensor multiplication, so there are indeed chances that optical computing could be used for AI inference.
Nevertheless, optical computing is unlikely to ever be competitive in implementing general-purpose computers. Optical computers may appear but they will be restricted to some niche applications. AI inference might be the only one that has become widespread enough to motivate R&D efforts in this direction.
aeonfox 12 hours ago [-]
> unlikely to ever be competitive
Bold claim to say these challenges will never be surmounted. For that to be true, either a more economic technology would have to mature first, or civilisation would have to halt progress. If scientific advances could yield miniaturised photonics with a significant cost/benefit over any contemporary technology, the concept will still be pursued. Unless you are suggesting that it is theoretically and physically impossible?
irickt 2 days ago [-]
An interesting and accessible article on the increased plausibility of photonic compute.