Make sector production more predictable
produce() limits production to how many units the workers can produce, rounding randomly. It charges work for the units actually produced, discarding fractions. If you get lucky with the random rounding, you may get a bit of extra work done for free. Else, you get to keep the unused work, and may even be undercharged a tiny bit of work. Has always been that way.

The production command assumes the random rounding rounds up if and only if the probability to do so is at least 50%. Thus, it's frequently off by one for sectors producing at their worker limit. The budget command runs the update code, and is therefore also off by one, only differently. Rather annoying for tech and research centers, where a single unit matters.

A tech center with full civilian population can produce 37.5 units in 60 etus. Given enough materials, it'll fluctuate between 37 and 38. Production consistently predicts 38, and budget randomly predicts either 37 or 38. Both are off by one half the time.

Fix this as follows: limit production to the amount the workers can produce (no rounding). Work becomes a hard limit, not subject to random fluctuations. Randomly round the work charged for actual production. On average, this charges exactly the work that's used. More importantly, production and budget now predict how much gets produced more accurately. They're still not exact, as the amount of work available for production remains slightly random.

This also "fixes" the smoke test on an i686 Debian 6 box for me. The root problem is that floating-point subexpressions may be computed either in double precision or in extended precision. Different machines (or different compilers, or even different compiler flags) may use different precision and get different results. Example: producing 108 units at one work per unit with sector p.e. 0.4 needs to charge 108 / 0.4 work. Computed in double precision, this gets rounded to 270.0, then truncated to 270. In 80-bit extended precision, it gets rounded to 269.999999999, then truncated to 269. With random rounding instead of truncation, the probability of a different result is vanishingly small. However, this commit introduces truncation in another place; it just happens not to mess up the smoke test there. I doubt this is the last time this kind of problem upsets the smoke test.
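The "random rounding" the message relies on means rounding a fractional value up with probability equal to its fractional part, so the expected result equals the exact value. A minimal sketch of such a helper, assuming nothing beyond what the message describes (the name round_random and the use of rand() are illustrative, not necessarily what the Empire code uses):

#include <stdlib.h>

/*
 * Illustrative helper: round x (>= 0) to an integer whose expected
 * value is x.  E.g. 37.5 becomes 37 or 38 with probability 0.5 each,
 * while 269.999999999 becomes 270 almost always.
 */
static int
round_random(double x)
{
    int lo = (int)x;            /* truncate toward zero */
    double frac = x - lo;       /* fractional part in [0, 1) */

    /* round up with probability equal to the fractional part */
    if ((double)rand() / ((double)RAND_MAX + 1.0) < frac)
        return lo + 1;
    return lo;
}

Charging work through a helper like this is what makes the average charge equal the work actually used, whereas plain truncation systematically undercharges.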
parent 739852dd3f
commit 6f7c93cdad
4 changed files with 2099 additions and 2098 deletions
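For the tech-center example above, the change to the worker limit made in the hunks below can be illustrated with made-up inputs that yield 37.5 units' worth of work (the expressions follow the diff; the specific numbers are only for illustration):

#include <stdio.h>

int
main(void)
{
    /* hypothetical inputs that give 37.5 units' worth of work */
    double work = 150.0, p_e = 0.25;
    int unit_work = 1;

    /* before: the worker limit was rounded to nearest, so 37.5 -> 38 */
    int old_limit = (int)(work * p_e / (double)unit_work + 0.5);

    /* after: keep the exact value and truncate only when capping
     * consumption, so work is a hard limit: 37.5 -> at most 37 units */
    double new_limit = work * p_e / (double)unit_work;

    printf("old limit %d, new limit %d\n", old_limit, (int)new_limit);
    return 0;
}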
@@ -82,7 +82,7 @@ prod(void)
     int there;
     int unit_work;	/* sum of component amounts */
     int mat_limit, res_limit;
-    int worker_limit;
+    double worker_limit;
     i_type it;
     i_type vtype;
     unsigned char *resource;
@@ -217,12 +217,12 @@ prod(void)
     if (unit_work == 0)
 	unit_work = 1;
 
-    worker_limit = (int)(work * p_e / (double)unit_work + 0.5);
+    worker_limit = work * p_e / (double)unit_work;
     res_limit = prod_resource_limit(pp, resource);
 
     max_consume = res_limit;
     if (max_consume > worker_limit)
-	max_consume = worker_limit;
+	max_consume = (int)worker_limit;
     material_consume = MIN(max_consume, mat_limit);
 
     prodeff = prod_eff(type, natp->nat_level[pp->p_nlndx]);
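The precision pitfall described in the commit message can be reproduced with a tiny program. This is only a sketch of the effect: whether it prints 269 or 270 depends on the compiler, optimization flags, and whether x87 extended precision is in use, which is exactly the unpredictability at issue.

#include <stdio.h>

int
main(void)
{
    double units = 108.0;	/* 108 units at one work per unit */
    double p_e = 0.4;		/* sector production efficiency */

    /*
     * 0.4 is not exactly representable in binary.  When the quotient is
     * rounded to double precision, 108 / 0.4 comes out as exactly 270.0
     * and truncates to 270.  If the compiler keeps the intermediate in
     * 80-bit x87 extended precision (common on i686 without SSE math),
     * the quotient is 269.999999999..., and truncation yields 269.
     * Random rounding would give 270 in either case with overwhelming
     * probability.
     */
    int charge = (int)(units / p_e);
    printf("work charged: %d\n", charge);
    return 0;
}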