We built masses of gear using opto-triac drivers and triacs some years ago.
That all used MOC3022 (non-zero-crossing) drivers and BTA08-400 triacs, with just a 180 ohm series resistor in line between the gate and A1. The resistors were 0.6 W MRS25 type, as we use for most things.
(As it was phase control rather than zero-crossing switching, each output channel also had a series choke and RC snubber for suppression on the AC side, as well as a fuse in each channel and in the common supply.)
The optos were driven by 2N7000 MOSFETs rather than directly from the MCU, to guarantee the correct firing current.
The main triac should fire within microseconds of the opto-triac switching on if there is any appreciable voltage across it, so in that case the resistor's job is to limit the worst-case current at peak voltage to something that will not damage the components (the opto or the triac gate) in the brief interval before the main triac is fully on and bypassing the trigger circuit.
That sets a minimum value, depending on the components and AC voltage.
The resistor only carries current for such a short time that it does not dissipate any real power.
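As a rough sketch of that minimum-value calculation (the peak-current figure here is an assumption for illustration, not from any specific datasheet - check the surge ratings of your actual opto and triac gate):

```python
# Minimum series resistor to limit worst-case trigger current at mains peak.
# All figures below are assumed example values, not datasheet quotes.

VRMS = 230.0        # assumed mains RMS voltage
I_PEAK_MAX = 2.0    # assumed worst-case current (A) the opto output and
                    # triac gate will tolerate for a few microseconds

v_peak = VRMS * 2 ** 0.5        # ~325 V peak on a 230 V supply
r_min = v_peak / I_PEAK_MAX     # minimum series resistance

print(f"peak voltage: {v_peak:.0f} V")
print(f"minimum R:    {r_min:.0f} ohms")
```

With those assumed numbers the minimum works out around 160 ohms, which is in the same ballpark as the 180 ohms used above; on 120 V mains the constraint is correspondingly looser.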
The other consideration is that if the opto-triac turns on at a zero crossing, the resistor in effect sets the point in the cycle at which the voltage across the main triac is sufficient to push enough gate current through the resistor to trigger it.
That ideally wants a fairly low value, so a practical phase-control design uses something mid-range between those two extremes.
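To put numbers on that trigger-point effect, here is a sketch with assumed figures (the gate trigger current and opto on-state drop are illustrative guesses - use the worst-case values from your actual datasheets):

```python
import math

# How far past the zero crossing the main triac fires, given the series
# resistor. All figures are assumed example values, not datasheet quotes.

VRMS = 230.0    # assumed mains RMS voltage
R = 180.0       # series gate resistor, as in the design above
IGT = 0.05      # assumed worst-case gate trigger current, 50 mA
V_OPTO = 3.0    # assumed on-state drop across the opto-triac output

v_peak = VRMS * math.sqrt(2)
v_fire = R * IGT + V_OPTO    # voltage across the main triac needed to fire
angle = math.degrees(math.asin(v_fire / v_peak))

print(f"firing threshold: {v_fire:.1f} V")
print(f"phase angle:      {angle:.1f} degrees after the zero crossing")
```

With these assumptions the main triac fires once about 12 V appears across it, only a couple of degrees into the half-cycle, so the delay is negligible for most phase-control purposes; a much larger resistor would push that threshold, and the resulting dead zone near the zero crossing, noticeably higher.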
A lower value such as 100 Ohms should be fine for zero crossing opto drivers.
The datasheets for the various opto triacs generally give example circuits and values.
If the circuit will be used with an inductive or capacitive load, things are more complex: inductive loads need snubbers etc., and if the load has significant capacitance, such as many electronic-ballast lamps, you should use only the zero-crossing (zero-voltage-switching) type of opto-triac.