
Best algorithm to minimize an output value by varying input data

I have an incoming stream of data and a set of transformations, which can be applied to the stream in various combinations to get a numerical output value. I need to find which subset of the transformations minimizes that output value.

The data is an ordered list of numbers with metadata attached to each one.

The transformations are quasi-linear: they are technically executable code in a Turing-complete language, but they are known to belong to a restricted subset which always halts, and they transform the input number to an output number with arithmetic operations whose flow depends on the attached metadata. Moreover, the operations are almost always linear (but they are not required to be, meaning this may be an avenue for optimization, but not a restriction).

Basically, a brute-force approach involving 2^n steps (where n is the number of transformations) would work, but it is woefully inefficient, and I'm almost certain it would not scale in production. Are there any algorithms that solve this task faster?
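For reference, the 2^n baseline can be sketched as a subset enumeration. All names here (`apply_pipeline`, the shape of the stream and transformations) are hypothetical stand-ins for the setup described above, which assumes transformations are applied in a fixed order and per-item outputs are summed:

```python
from itertools import combinations

def apply_pipeline(stream, transforms):
    # Hypothetical evaluator: run each chosen transformation in order on
    # every (value, metadata) pair and sum the resulting values.
    total = 0
    for value, metadata in stream:
        for t in transforms:
            value = t(value, metadata)
        total += value
    return total

def minimize_by_brute_force(stream, transforms):
    # Try every subset of transformations (including the empty one) and
    # keep the subset that yields the smallest output value.
    best_subset, best_value = (), apply_pipeline(stream, ())
    for r in range(1, len(transforms) + 1):
        for subset in combinations(transforms, r):
            value = apply_pipeline(stream, subset)
            if value < best_value:
                best_subset, best_value = subset, value
    return best_subset, best_value
```

This is exactly the approach the question wants to avoid, but it gives a correctness oracle to validate any faster heuristic against on small n.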


If almost all operations are linear, can't you use linear programming as a heuristic?

And maybe, in between, check whether some transformations are particularly slow, in which case you can still switch to brute force.
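To illustrate the comment's idea: if the transformations' effects are (approximately) additive, the LP relaxation with binary selection variables has an integral optimum that simply includes every transformation whose marginal effect on the output is negative. A minimal sketch under that additivity assumption, with all names (`apply_pipeline`, the stream shape) hypothetical:

```python
def apply_pipeline(stream, transforms):
    # Hypothetical evaluator, as in the question's setup: apply each chosen
    # transformation in order and sum the per-item results.
    total = 0
    for value, metadata in stream:
        for t in transforms:
            value = t(value, metadata)
        total += value
    return total

def linear_heuristic(stream, transforms):
    # Assumption: effects are additive, so the change caused by applying a
    # transformation alone predicts its contribution in any subset. Then
    # the minimizer is simply the set of transformations whose marginal
    # effect is negative.
    baseline = apply_pipeline(stream, ())
    chosen = tuple(
        t for t in transforms
        if apply_pipeline(stream, (t,)) - baseline < 0
    )
    return chosen, apply_pipeline(stream, chosen)
```

When the actual pipeline composes transformations nonlinearly, this only gives a starting point; its answer should be spot-checked against brute force on small instances, as the comment suggests.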

Do you need to find the optimal output?

