-
My implementation is here.

# Tiny SuperOptimizer

Tiny SuperOptimizer is a program synthesis tool that generates a minimal Bril program from an input Python function.

## Introduction

The idea of implementing a superoptimizer comes from this paper. Given an instruction set, the superoptimizer finds the shortest program that computes a function.

## Demo

Tiny SuperOptimizer takes a Python function as input, for example:

```python
def f(x, y):
    a = x * 2 + y
    b = a * 5
    c = b * (128 // 4)
    return c
```

The superoptimizer takes the function and its argument names as input, synthesizes the minimal program, and converts it to Bril's JSON form:

```python
tree, holes = superoptimize(f, ['x', 'y'])
converter = to_bril(tree, holes)
print(converter.bril_prog)
```

The result looks like this:

```
$ python main.py | bril2txt
@main(x: int, y: int) {
v20: int = const 160;
v1: int = mul v20 y;
v12: int = const 6;
v0: int = shl x v12;
v0: int = add v0 v1;
print v0;
}
```

As a result, the output program is a simplified version of the input Python function. A print instruction is added to display the result.

## Implementation

The idea of synthesizing a minimal program is to traverse a search space of candidate programs: the optimizer enumerates all combinations of instruction choices and formulates a proof problem for each one.
The two key parts of the process are generating operand choice encodings and enumerating all cases in the search space.

## Encode operand choices

We limit the instruction set to binary operations, so the optimizer encodes each operand as either one of the input variables or a constant. For example, with three inputs, an operand has four possible choices: one of the three inputs, or a constant value.
## Generate the search space

With the operands encoded, the optimizer can enumerate all operation combinations for a given number of instructions.

## Discussion

We limit the "instruction set" to a small group of binary operations:

```python
# set of operations used in superoptimization
# sorted by computation cost, from low to high
self.ops = ['<<', '>>', '+', '-', '*', '/']
```

The optimizer prefers operations with cheaper computation cost over expensive ones. Even though the scope of the problem is limited, the search space still grows exponentially with the number of instructions.
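The enumerate-and-check loop described above can be pictured in plain Python. This is a hypothetical illustration, not the actual Tiny SuperOptimizer code: the operand pool, constant set, and the use of input/output testing in place of a real equivalence proof are all assumptions made for the example.

```python
import itertools

# Hypothetical sketch of the enumeration loop, not the actual Tiny
# SuperOptimizer code.  Operands are input variables, earlier temps, or
# constants from a small pool; candidates are validated by testing on
# sample inputs (the real tool would discharge a proof instead).
OPS = {
    '+': lambda a, b: a + b,
    '-': lambda a, b: a - b,
    '*': lambda a, b: a * b,
    '<<': lambda a, b: a << b,
}
CONSTS = [1, 2, 5, 32]

def run(prog, env):
    # prog is a list of (op, lhs, rhs); instruction i defines temp 'ti'
    env = dict(env)
    for i, (op, a, b) in enumerate(prog):
        def val(o):
            return o if isinstance(o, int) else env[o]
        env[f't{i}'] = OPS[op](val(a), val(b))
    return env[f't{len(prog) - 1}']

def superoptimize(target, inputs, max_len=2):
    samples = [dict(zip(inputs, vals))
               for vals in itertools.product(range(-3, 4), repeat=len(inputs))]
    for n in range(1, max_len + 1):
        # operand choices for each instruction slot
        slots = [list(inputs) + CONSTS + [f't{i}' for i in range(k)]
                 for k in range(n)]
        for prog in itertools.product(
                *[itertools.product(OPS, slots[k], slots[k])
                  for k in range(n)]):
            try:
                if all(run(prog, s) == target(**s) for s in samples):
                    return list(prog)
            except ValueError:  # e.g. negative shift count
                continue
    return None

prog = superoptimize(lambda x, y: x * 2 + y, ['x', 'y'])
```

Candidate programs are enumerated shortest-first, so the first match is also a minimal one (under these toy assumptions).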
-
Code is here.

## Implementation

I followed James Bornholt's Rosette tutorial. I defined a simple AST containing if statements, some arithmetic operations, and some equality operations. The AST also contains hole nodes for sketching.
We can then define a sketch and assert that the interpreted sketch is always equal to the interpreted concrete expression. Then calling evaluate on the sketch yields an equivalent expression. For example, if I wanted to get an equivalent if expression for an absolute value expression, I could write:
which results in an equivalent if expression.

## Evaluation

Rosette was able to synthesize some simple expressions. Another example that worked was trying to synthesize a multiplication expression equivalent to a square expression, which also resulted in an equivalent expression. There were many sketches that Rosette got tripped up on, though. At first I included an arithmetic shift operation in the AST and tried synthesizing a multiplication expression equivalent to a left shift of 2, but Rosette said the sketch was unsatisfiable. I even tried simpler sketches, with no luck.

## Discussion

Rosette was surprisingly challenging to use. When a sketch is unsatisfiable, Rosette responds with the same error message every time:
which doesn't really help me find the issue. So I still have no idea why it was struggling with shift operations and simple multiplication.
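For what it's worth, the shift sketch is mathematically satisfiable, so the failure was likely in the encoding rather than the math. A quick plain-Python check (hypothetical, unrelated to the Rosette code) confirms that a multiplication equivalent of "shift left by 2" exists for fixed-width bitvectors:

```python
# Check that a multiplication equivalent of "shift left by 2" exists for
# fixed-width bitvectors: x << 2 == x * 4 (mod 2^WIDTH), so a sketch
# asking the synthesizer for such a multiplication should be satisfiable.
WIDTH = 8
MASK = (1 << WIDTH) - 1

def shl2(x):
    return (x << 2) & MASK

def mul4(x):
    return (x * 4) & MASK

assert all(shl2(x) == mul4(x) for x in range(1 << WIDTH))
```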
-
## Summary

I used the sketch builder from https://github.com/sampsyo/minisynth/blob/master/ex2.py and built off this sketching program. To add extra features, I added comparison operators: less than, less than or equal, greater than, greater than or equal, equals, and not equals. I implemented these in ex2.py and added test sketches to see whether a solution could be synthesized for the holes, including examples using the less-than and greater-than-or-equal operators to check that the sketch worked properly.

To do something different, I also added simple structures that resemble for loops: the forplus, forminus, and fortimes loops. Unfortunately, I was not able to get fortimes to work the way I wanted, perhaps because I had to convert bitvectors to ints and then back to bitvectors. Besides this, forminus and forplus seem to work appropriately and can be used to synthesize simple iteration.

In terms of test cases, for these small, contrived examples the solver can find a solution relatively quickly. Most of the test cases are not very large, and the time to solve the sketch is minuscule. Even for the test case named large.txt, the time to solve the sketch is relatively short, though this may be because the expression with holes is still syntactically similar to the expression without holes.

As a whole, this lesson was fairly straightforward, given that the code in ex2.py was already complete when I started with it. Adding more elements was fairly simple, modulo some problems figuring out types in Z3, which were quickly fixed by looking at the Z3 documentation. Overall, I think synthesis seems remarkable, in that a program can be built without human intervention. I do have trouble seeing it used in commonplace applications, though, because it seems to scale poorly on larger sketch programs.
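As a rough picture of what the hole-filling step does with a comparison sketch, here is a toy, Z3-free stand-in (all names hypothetical, not the ex2.py code) that searches for a constant hole by brute force:

```python
# Toy, Z3-free stand-in for hole-filling on a comparison sketch:
# exhaustively search for a constant hole h such that the sketch
# "x < h" matches the spec "x <= 9" on every 8-bit input.
def synthesize_hole(sketch, spec, width=8):
    for h in range(1 << width):
        if all(sketch(x, h) == spec(x) for x in range(1 << width)):
            return h
    return None

h = synthesize_hole(lambda x, h: x < h, lambda x: x <= 9)
assert h == 10  # "x < 10" is equivalent to "x <= 9" on this domain
```

The Z3-based version solves the same problem symbolically instead of enumerating the domain, which is what lets it scale past toy bit widths.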
-
I added strings to the language in ex2. Here is my code.

## Implementation

I added support for escaped strings in the Lark parser and removed some of the math operations to simplify the implementation for now. This language supports string replace, strToInt, intToStr, and substring functions. Unfortunately, these functions can't be implemented naturally to work in both the interpretation and Z3 goal cases, so I had to specialize the interp function to build the right Z3 objects. The Python bindings support all of these functions directly, so I didn't have to do any encoding myself. Holes can be either strings or integers, depending on their location in a "function" call.

## Testing

I wrote a few example sketches; here is a subset of them:
All of the sketches I wrote were easily solved by Z3, but there were a few times I made mistakes in how I wrote a sketch, which caused Z3 to never finish.

## Discussion

I found Z3 pretty easy to use. Honestly, the more challenging part of this assignment was modifying the parser. Lark is fairly easy to use, but I hadn't used it before, so I made some silly mistakes at first that didn't give errors that were easy to debug.
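To illustrate what solving a string hole means, here is a brute-force toy version in plain Python (the sketch shape and names are made up for the example; the real implementation hands the constraint to Z3's string theory instead of enumerating):

```python
import itertools
import string

# Illustrative brute-force version of solving a string hole: find a
# string h such that (input + h).replace("b", "c") produces the
# expected output on the given example.
def solve_string_hole(sketch, examples, alphabet=string.ascii_lowercase,
                      max_len=2):
    for n in range(max_len + 1):
        for cand in itertools.product(alphabet, repeat=n):
            h = ''.join(cand)
            if all(sketch(h, inp) == out for inp, out in examples):
                return h
    return None

hole = solve_string_hole(lambda h, inp: (inp + h).replace("b", "c"),
                         [("ab", "acde")])
assert hole == "de"
```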
-
My code is here.

## Implementation

Lacking the insight into bit operations necessary to make good sketches, I instead opted to extend the program synthesis assignment by brute-forcing over the space of possible sketches, up to a fixed depth of 5. It only searches over programs that use the bit operators.
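A minimal sketch of this kind of search (hypothetical code, not the author's; depth kept at 2 rather than 5 for speed) might look like:

```python
import itertools

# Rough sketch of brute-forcing expression trees over bit operators up
# to a fixed depth, accepting the first tree that matches the target on
# a set of test inputs.
BITOPS = {
    '&': lambda a, b: a & b,
    '|': lambda a, b: a | b,
    '^': lambda a, b: a ^ b,
    '<<': lambda a, b: a << b,
    '>>': lambda a, b: a >> b,
}

def trees(depth, leaves):
    # all expression trees up to `depth`, as nested tuples
    yield from leaves
    if depth > 0:
        for op in BITOPS:
            for l, r in itertools.product(list(trees(depth - 1, leaves)),
                                          repeat=2):
                yield (op, l, r)

def evaluate(t, x):
    if t == 'x':
        return x
    if isinstance(t, int):
        return t
    op, l, r = t
    return BITOPS[op](evaluate(l, x), evaluate(r, x))

def search(target, depth=2, tests=range(64)):
    for t in trees(depth, ['x', 1, 2]):
        if all(evaluate(t, x) == target(x) for x in tests):
            return t

# e.g. find a tree computing x * 4 using only bit operators
t = search(lambda x: x * 4)
```

Matching on a finite test set rather than proving equivalence is the usual brute-force shortcut; a found tree should still be verified (e.g. with Z3) before being trusted.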
-
In this task, I implemented a simple solver for polynomial factorization over the integers. First, I added power support to the expression language. After my synthesizer reads in the original polynomial, it automatically constructs a factored sketch whose coefficients are holes. The synthesizer works as expected: I show three non-trivial examples below, all of which demonstrate its correctness. However, when the exponent is greater than 3, it already takes more than 20 minutes to find the solution on my M1 MacBook.

```
// example 1
((2 * (x ^ 2)) - (9 * x)) - 5
((1 * x) + -5) * ((2 * x) + 1)

// example 2
(((x ^ 3) + (6 * (x ^ 2))) + (11 * x)) + 6
(((1 * x) + 1) * ((1 * x) + 2)) * ((1 * x) + 3)

// example 3
((((x ^ 4) + (x ^ 3)) - (19 * (x ^ 2))) + (11 * x)) + 30
((((1 * x) + 1) * ((1 * x) + -2)) * ((1 * x) + 5)) * ((1 * x) + -3)
```

Originally, I wanted to implement a general factorization algorithm like the one in Mathematica, but that becomes very complicated when the roots of the polynomial are complex.
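These identities can be sanity-checked numerically: two polynomials of degree at most 4 that agree on 21 integer points must be identical. A small hypothetical checker:

```python
# Compare both sides of each factorization at many integer points;
# agreement on more points than the degree proves the identity.
def check(poly, factored, points=range(-10, 11)):
    return all(poly(x) == factored(x) for x in points)

assert check(lambda x: (2 * x**2 - 9 * x) - 5,
             lambda x: (x - 5) * (2 * x + 1))
assert check(lambda x: x**3 + 6 * x**2 + 11 * x + 6,
             lambda x: (x + 1) * (x + 2) * (x + 3))
assert check(lambda x: x**4 + x**3 - 19 * x**2 + 11 * x + 30,
             lambda x: (x + 1) * (x - 2) * (x + 5) * (x - 3))
```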
-
# Synthia!

For this assignment I collaborated with @charles-rs. The code is here.

## Extending the language

We started off with the complete sketch (including the ternary operator) from Adrian's tutorial. We first extended the language with some bitwise operators.
This synthesizes a "better" (kind of?) version of multiplication that does a check and may just return one of its operands. We had to extend the interpreter as well, since we wanted the only type of values in the program to be BitVecs (16 bits). Thus even comparisons return bitvectors instead of booleans, so we have to simulate them with Z3 "if then else" expressions.

## Craters

Our main extension over the original sketch was actually not the bitwise operators and comparisons, but rather augmented holes. The tutorial pointed out (which I still think is like magic) that a conditional inside a hole lets the synthesizer choose between alternatives. However, due to the limitations of Z3, this was not trivial to implement, and required reworking the constructed Z3 trees.
## Experiments

We were able to implement some interesting bit-manipulation hacks and other mathy simplifications: a quadratic expansion, a sign function, and min.

## Challenges

Setting up emacs for Python was confusing. Also, Z3 switches the order of operands without permission.
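The comparisons-as-bitvectors simulation and the min experiment can be mocked up in plain Python (a hypothetical illustration of the encoding, not our actual Z3 code):

```python
# Plain-Python mock of the "comparisons return bitvectors" trick:
# every value is a 16-bit word, and a comparison produces all-ones for
# true and all-zeros for false via an if-then-else.
MASK = 0xFFFF

def lt(a, b):
    # unsigned 16-bit comparison, result as a bitvector "boolean"
    return 0xFFFF if (a & MASK) < (b & MASK) else 0x0000

def select(flag, a, b):
    # with all-ones/all-zeros flags, if-then-else is pure bit arithmetic
    return ((flag & a) | (~flag & b)) & MASK

def bvmin(a, b):
    # the "min" experiment, with no boolean type anywhere
    return select(lt(a, b), a, b)

assert bvmin(3, 7) == 3 and bvmin(7, 3) == 3
```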
-
# (Is) Program Synthesis (Possible?)

Work done by @ayakayorihiro, @anshumanmohan, and @5hubh4m. Our code is here.

## Language Extension

We played around with implementing floats in the language. Although this at first looked as simple as swapping the integer sort for a floating-point one, it didn't work out, so we abandoned it and settled on another, more straightforward expansion of the language: a switch expression.
which evaluates to the expression associated with the matching case. We extend the minisynth parser, interpreter, and Z3 expression builder to support this.

## Synthesis

Our problem statement is a bit different from the one in class. Instead of synthesizing holes, we aim to find efficient lookup tables for entire expressions ("functions" of a single variable). We represent the table as a large switch expression whose values we leave as holes, and then let Z3 synthesize them for our expression. Additionally, we add a pruning step, where we compress the lookup table by removing redundant entries that match the default case.

## Examples
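The lookup-table-plus-pruning idea can be illustrated with a small hypothetical Python version (in the real pipeline the table values are holes that Z3 fills in):

```python
from collections import Counter

# Tabulate an 8-bit function as a switch/lookup table, then prune
# entries that already equal the default case (here chosen as the
# function's most common output value).
def build_table(f, width=8):
    entries = {x: f(x) for x in range(1 << width)}
    default = Counter(entries.values()).most_common(1)[0][0]
    table = {x: v for x, v in entries.items() if v != default}
    return table, default

def lookup(table, default, x):
    return table.get(x, default)

# a function that is 0 almost everywhere compresses to 3 entries
table, default = build_table(lambda x: x * x if x < 4 else 0)
assert len(table) == 3 and lookup(table, default, 2) == 4
```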
The last case is interesting: in Z3, all functions are total, so even division by 0 is defined, which is why we see the weird result in the case of 255.

## An Interesting Case of Z3
-
# Boolean Expression Simplifier

The code is here. I changed the original ex2 program into a Boolean expression simplifier: given a Boolean expression as input, it returns a simplified version. This is similar to what a Karnaugh map (K-map) would do. The search phase is an iterative process, like what a superoptimizer would do.

## How to use & Testcases
Output is as follows:
Output is as follows:
Output is as follows:
Note the simplified expression in the output above.

## Implementation
## Limitation

It can't guarantee finding the most simplified version, due to the naive usage of the Z3 solver, as shown in the test cases.
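The iterative search can be pictured with a toy, Z3-free enumerator that grows candidate expressions round by round and returns the first one whose truth table matches (in the real tool, Z3 plays the equivalence-checker role; all names here are made up):

```python
import itertools

def truth_table(f, n):
    # evaluate f on every assignment of n boolean variables
    return tuple(bool(f(*bits))
                 for bits in itertools.product([0, 1], repeat=n))

def simplify(f, n, rounds=2):
    # candidates start as the bare variables v0..v(n-1)
    target = truth_table(f, n)
    cands = [((lambda *a, i=i: a[i]), f'v{i}') for i in range(n)]
    for g, s in cands:
        if truth_table(g, n) == target:
            return s
    for _ in range(rounds):
        new = []
        for (ga, sa), (gb, sb) in itertools.product(cands, repeat=2):
            new.append(((lambda *a, ga=ga, gb=gb: ga(*a) and gb(*a)),
                        f'({sa} & {sb})'))
            new.append(((lambda *a, ga=ga, gb=gb: ga(*a) or gb(*a)),
                        f'({sa} | {sb})'))
        for ga, sa in cands:
            new.append(((lambda *a, ga=ga: not ga(*a)), f'~{sa}'))
        for g, s in new:
            if truth_table(g, n) == target:
                return s
        cands += new
    return None

# (a & b) | (a & ~b) collapses to just "a"
expr = simplify(lambda a, b: (a and b) or (a and not b), 2)
assert expr == 'v0'
```

Because candidates are checked smallest-first, the first match is also among the smallest, which is exactly the property the Z3-backed version cannot guarantee with a naive encoding.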
-
My extension adds let expressions. Supporting them required passing another parameter, an environment of bound variables, to the interpreter. As an extension to the example used in class, I checked whether variables were scoped correctly.
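A minimal interpreter fragment (hypothetical structure and names, not the actual code) shows why the extra environment parameter is what makes scoping work:

```python
# Minimal interpreter with let: bindings extend a copied environment
# for the body only, so shadowing and scoping fall out naturally.
def interp(expr, env):
    kind = expr[0]
    if kind == 'num':
        return expr[1]
    if kind == 'var':
        return env[expr[1]]
    if kind == 'add':
        return interp(expr[1], env) + interp(expr[2], env)
    if kind == 'let':                 # ('let', name, bound, body)
        _, name, bound, body = expr
        inner = dict(env)             # copy: binding is scoped to body
        inner[name] = interp(bound, env)
        return interp(body, inner)
    raise ValueError(kind)

# let x = 2 in (let x = x + 1 in x + x)  evaluates to 6
e = ('let', 'x', ('num', 2),
     ('let', 'x', ('add', ('var', 'x'), ('num', 1)),
      ('add', ('var', 'x'), ('var', 'x'))))
assert interp(e, {}) == 6
```

The inner `let` shadows the outer `x`, but the bound expression still sees the outer binding, which is the scoping behavior worth testing.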
-
# Sets

I implemented a version of the example in the tutorial that supports sets. My code can be found here.

## Operators

I have added support for the following operators: intersect, union, and difference.
An example expression could be the following:
## Tests

I provided three examples, one for each of the set operators that I added: intersect, union, and difference.
## Comments

It was a little bit tricky to integrate this into the already existing synthesizer, so I made some core modifications to the parser in order to support the set operations; as a result, it now supports only those.
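A toy version of what synthesis over this language amounts to (hypothetical names, plain Python sets instead of the solver's encoding): given input/output examples, pick which set operator fills the hole.

```python
# Toy sketch-solver for a set language: try each operator against the
# input/output examples and return the first one that fits.
OPS = {
    'intersect': lambda a, b: a & b,
    'union': lambda a, b: a | b,
    'difference': lambda a, b: a - b,
}

def synthesize_op(examples):
    for name, f in OPS.items():
        if all(f(a, b) == out for a, b, out in examples):
            return name
    return None

op = synthesize_op([
    ({1, 2, 3}, {2, 3, 4}, {2, 3}),
    ({1, 2}, {5}, set()),
])
assert op == 'intersect'
```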
-
Code is here.

## Summary

I followed and extended James Bornholt's tutorial on Building a Program Synthesizer. I chose to extend the DSL in the tutorial with division and subtraction.

## Implementation

I basically added all the same steps that James did in the tutorial, but with my new, extended DSL. This was quite straightforward.

## Evaluation

I played around to see if my version would generate the same program for the example he synthesized. What was interesting is that it gave back a different, though equivalent, program! I'm not sure why this happened. I ran both my version and his version many times and kept getting the same answers, so each seems deterministic per example, but we are unable to decipher why a DSL with additional constructs deterministically produces a different result. I assume it has to do with Z3 magic behind the scenes, satisfying the constraints in a different manner. But does this have a real impact on synthesis?

I also tested some other programs to make sure I could in fact synthesize programs with division and subtraction. One thing I also thought about is that division on a CPU is slow; a classic optimization replaces the division with a faster sequence of instructions. I wonder how difficult it would be to extend my code to do something like this.
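The division strength-reduction idea can be sketched quickly (a hypothetical helper, not part of the tutorial or my code): when the divisor is a known power of two, unsigned division becomes a right shift.

```python
# Strength reduction for division: when the constant divisor is a
# power of two, divide non-negative n with a right shift instead.
def reduce_div(n, d):
    if d > 0 and d & (d - 1) == 0:        # d is a power of two
        return n >> (d.bit_length() - 1)
    return n // d                          # fall back to real division

assert all(reduce_div(n, 8) == n // 8 for n in range(1000))
assert reduce_div(100, 7) == 100 // 7
```

A synthesizer could discover the same rewrite by sketching `n >> h` and asking Z3 for a hole `h` that matches `n // d` on all inputs.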
-
The tasks for the program synthesis lesson are extremely open-ended: just do something interesting on top of the basic synthesizer that we built in class!