I know that the size of the buffer can play an important role in an application's performance, but what is the best way to choose a buffer size? What should I think about when sizing?
Something like "best practices".
To demonstrate that buffer size is actually an important factor in performance, you must measure: benchmark the program with different sizes and compare the results. At that point you already have a method for finding a good value (simply keep testing and comparing).
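As a rough sketch of that measure-and-compare approach, the harness below pushes a fixed number of values through channels of different buffer sizes and times each run. The workload (`run`) is a placeholder; substitute your real producer/consumer logic, since the best size depends entirely on your workload.

```go
package main

import (
	"fmt"
	"time"
)

// run sends n ints through a channel with the given buffer size,
// returning how many values the consumer received and how long
// the whole transfer took.
func run(bufSize, n int) (int, time.Duration) {
	ch := make(chan int, bufSize)
	done := make(chan int)
	start := time.Now()
	go func() {
		count := 0
		for range ch {
			count++ // stand-in for real per-item work
		}
		done <- count
	}()
	for i := 0; i < n; i++ {
		ch <- i
	}
	close(ch)
	count := <-done
	return count, time.Since(start)
}

func main() {
	for _, size := range []int{0, 1, 64, 1024} {
		count, elapsed := run(size, 100_000)
		fmt.Printf("buffer %4d: %d values in %v\n", size, count, elapsed)
	}
}
```

Larger buffers reduce how often the sender and receiver block on each other, but past a certain point the extra capacity just costs memory; only the measurements can tell you where that point is for your program.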
As far as what to think about when sizing a channel, that may be a different question. The first thing to consider is the correctness of the program: will it operate as intended, without faults? As mentioned in the comments, in the vast majority of cases the "correct" value is either 0 (for synchronized, rendezvous-style communication, where every send waits for a matching receive) or 1 (for unsynchronized communication, where a send can complete before the receiver is ready).
If your channel does require a buffer size of more than 1, then you must determine the upper bound, i.e. the "worst case" number of values the channel must hold at once to avoid deadlocks. If you can't determine exactly what that number is, that's a good sign there is no upper bound. For example, if you have a recursive routine that sends messages, there may be no upper bound. In that case, you must redesign your program to store the values in a dynamic structure, such as a slice.
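One way that redesign can look (a sketch, with names of my own invention) is a goroutine that shuttles values between an input and an output channel while holding any backlog in a slice, so the queue grows as needed instead of relying on a fixed channel buffer.

```go
package main

import "fmt"

// unboundedQueue returns an input and an output channel connected by
// a goroutine that buffers pending values in a slice, so the queue's
// capacity is limited only by memory rather than a fixed buffer size.
// Closing the input channel drains the backlog and closes the output.
func unboundedQueue() (chan<- int, <-chan int) {
	in := make(chan int)
	out := make(chan int)
	go func() {
		var pending []int
		for in != nil || len(pending) > 0 {
			// Only enable the send case when there is something to send;
			// a send on a nil channel never proceeds in a select.
			var sendCh chan int
			var next int
			if len(pending) > 0 {
				sendCh = out
				next = pending[0]
			}
			select {
			case v, ok := <-in:
				if !ok {
					in = nil // input closed: stop selecting on it
					continue
				}
				pending = append(pending, v)
			case sendCh <- next:
				pending = pending[1:]
			}
		}
		close(out)
	}()
	return in, out
}

func main() {
	in, out := unboundedQueue()
	for i := 1; i <= 5; i++ {
		in <- i // accepted even though nothing is reading out yet
	}
	close(in)
	sum := 0
	for v := range out {
		sum += v
	}
	fmt.Println(sum)
}
```

The trade-off is that an unbounded queue removes backpressure entirely: a slow consumer no longer slows the producer down, so memory use can grow without limit. That is often the right cue to reconsider the design rather than just the buffer size.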
Answered By – Hymns For Disco
Answer Checked By – Cary Denson (GoLangFix Admin)