Project Euler #190: Maximising a weighted product

  • + 1 comment

    Given X, A and m with:
    x_1 + x_2 + .. + x_m = X
    a_1 + a_2 + .. + a_m = A
    For a given (a_1, a_2, .. a_m), we want to maximize x_1^a_1 * x_2^a_2 * ... * x_m^a_m; call this maximum Qm(X, (a_1, a_2, .. a_m)).

    Question:
    When trying to find Qm(X, (a_1, a_2, .. a_m)), is there an existing formula for it, or do we need to calculate x_1^a_1 * x_2^a_2 * ... * x_m^a_m for all (x_1, x_2, .. x_m) combinations?
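    For small integer x_i, the brute-force approach mentioned in the question can be sketched like this (a naive Python helper of my own, for comparison only; note the actual optimum is over the reals, so a grid search can only approximate it):

```python
from itertools import product

def q_bruteforce(X, a):
    """Naive search: try every tuple of positive integers (x_1, .., x_m)
    summing to X and keep the largest product of x_i**a_i."""
    m = len(a)
    best = 0.0
    for xs in product(range(1, X + 1), repeat=m):
        if sum(xs) != X:
            continue
        p = 1.0
        for x, ai in zip(xs, a):
            p *= x ** ai
        best = max(best, p)
    return best

# Example: maximise x_1 * x_2^2 with x_1 + x_2 = 6; x = (2, 4) gives 32
print(q_bruteforce(6, [1, 2]))
```

    The number of tuples grows exponentially in m, which is why the closed form discussed below matters.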

    • + 1 comment

      Hint: What values can the vectors take?

      • + 2 comments

        Pretty much unlimited.

        So what mathematical theorem applies here?

        • + 2 comments

          Given a_1 through a_m, optimal values for x_1 through x_m can be found in closed form via the Lagrange multiplier method: https://en.wikipedia.org/wiki/Lagrange_multiplier

          You can find that at optimality x_i = (a_i / A) * X, and therefore the optimal value is Qm = (X/A)^A * a_1^a_1 * a_2^a_2 * ... * a_m^a_m.
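          As a quick sanity check, the closed form can be computed directly; this is a minimal Python sketch (the helper name is mine):

```python
def q_closed_form(X, a):
    """Closed-form maximum of x_1**a_1 * ... * x_m**a_m subject to
    x_1 + ... + x_m = X, using the Lagrange condition x_i = a_i * X / A."""
    A = sum(a)
    xs = [ai * X / A for ai in a]
    q = 1.0
    for x, ai in zip(xs, a):
        q *= x ** ai
    return xs, q

xs, q = q_closed_form(6, [1, 2])
print(xs)  # [2.0, 4.0]
print(q)   # 2 * 4^2 = 32.0
```

          This agrees with a brute-force search on small integer instances.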

          • + 2 comments

            We need to maximize

                f(x) = x_1^a_1 * x_2^a_2 * ... * x_m^a_m

            with regard to the constraint

                g(x) = x_1 + x_2 + ... + x_m - X = 0,

            which is an affine hyperplane (or some subset), as pointed out in another comment.

            We define the Lagrange function

                L(x, lambda) = f(x) - lambda * g(x)

            and look for critical points. These are the (x, lambda) where the gradient of L vanishes:

                dL/dx_i = (a_i / x_i) * f(x) - lambda = 0,    i = 1, .., m
                dL/dlambda = -g(x) = 0

            The second equation keeps the constraint intact; the first equation, for i = 1, .., m, can be rearranged into

                a_i * f(x) = lambda * x_i.

            Summation over i yields the equation

                A * f(x) = lambda * X,

            which gives the critical points

                x_i = (a_i / A) * X.

            Inserting into f we get

                f(x) = (X/A)^A * a_1^a_1 * ... * a_m^a_m.

            Note that this need not be a maximum or minimum. As in the unconstrained case, one must analyze the second-order partial derivatives to determine the nature of the critical point.
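            As a numerical sanity check of the derivation, one can perturb the critical point within the constraint hyperplane and verify that f only decreases (a sketch, with an example a and X of my own choosing):

```python
def f(xs, a):
    """Objective: product of x_i ** a_i."""
    p = 1.0
    for x, ai in zip(xs, a):
        p *= x ** ai
    return p

a = [1, 2, 3]
X = 12.0
A = sum(a)
xstar = [ai * X / A for ai in a]   # critical point from the derivation: [2, 4, 6]
base = f(xstar, a)

# Perturb within the hyperplane sum(x) = X: add eps to one coordinate,
# subtract it from another, and check f never exceeds f(x*).
eps = 1e-3
for i in range(len(a)):
    for j in range(len(a)):
        if i == j:
            continue
        xs = xstar[:]
        xs[i] += eps
        xs[j] -= eps
        assert f(xs, a) <= base
print("all constrained perturbations decrease f")
```

            This only probes nearby points, of course; the convexity argument below settles the global question.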

            • + 1 comment

              Analysis of the Hessian is unnecessary. Rephrase the problem as minimization of F(x) = -(a_1*log(x_1) + .. + a_m*log(x_m)) and note the function is convex in x. The Lagrange multiplier method gives you the same critical point x* of course. For convex functions, the first-order Taylor approximation is actually a supporting hyperplane: F(y) >= F(x) + grad F(x) . (y - x). Choose x = x* and y on the constraint surface y_1 + .. + y_m = X. Then grad F(x*) = -(A/X) * (1, .., 1), and plugging into the hyperplane inequality, F(y) >= F(x*) - (A/X) * ((y_1 + .. + y_m) - (x*_1 + .. + x*_m)) = F(x*), so x* must be a global minimizer of F and hence a global maximizer of f.
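              The supporting-hyperplane argument can also be probed numerically: sample random positive points on the constraint surface and verify F never drops below F(x*). A sketch, with example data of my own choosing:

```python
import math
import random

def F(xs, a):
    """Convex objective: minimising F is maximising prod(x_i ** a_i)."""
    return -sum(ai * math.log(x) for ai, x in zip(a, xs))

a = [1, 2, 3]
X = 12.0
A = sum(a)
xstar = [ai * X / A for ai in a]

random.seed(0)
for _ in range(1000):
    # random positive point on the constraint surface sum(y) = X
    w = [random.random() + 1e-9 for _ in a]
    s = sum(w)
    y = [X * wi / s for wi in w]
    assert F(y, a) >= F(xstar, a) - 1e-12
print("x* is a global minimizer of F on the constraint")
```

              Random sampling is not a proof, naturally; the convexity argument above is what makes the conclusion rigorous.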

              • + 0 comments

                Thanks for that hint.

          • + 2 comments

            Thanks for your hint.

            Actually I followed Lagrange Multiplier formula and reached something like this:
            a_1*x_2^a_2 = a_2*x_1^a_1
            ..
            a_1*x_m^a_m = a_m*x_1^a_1

            And if I substitute all the other x's with x_1 in x_1 + x_2 + .. + x_m = X, I get a higher-degree equation in the single variable x_1, and it is hard to solve.

            How do you deduce x_i = (X/A)^m * a_i^a_i? Is there some other maths method being used?

            • + 1 comment

              As we are in full spoiler mode now, I added my derivation. :-)

              PS markisusmarkmark showed we can use LaTeX/MathJaX markup. The only deviation seems to be \\\ instead of \\.

              PPS The exponent of the first factor is not m but A.

              • + 1 comment

                Actually not full spoiler since my program doesn't work... I think there is a trick I'm missing.

                • + 1 comment

                  Have you corrected that exponent?

                  • + 0 comments

                    Yes, sorry, that was a mistake in my write-up, but I think it was correct in the code. Thank you for the correction.

            • + 1 comment

              Thanks, @mvanwoerkom.

              I made a mistake in calculating the gradient; it should be:

              ...

              • + 1 comment

                Seems different. I do not know how you arrived there.

                Can you provide the function whose gradient you took? And with respect to which variables did you take the partial derivatives?

                • + 1 comment

                  This is the same as the one you arrived at. When you have:

                      (a_i / x_i) * f(x) = lambda

                  It is the same as:

                      a_i * f(x) = lambda * x_i

                  Hence:

                      a_1 * f(x) = lambda * x_1
                      ..
                      a_m * f(x) = lambda * x_m

                  And divide the first equation with the others:

                      x_i = (a_i / a_1) * x_1
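                  Dividing the equations pairwise makes the system linear: substituting x_i = (a_i/a_1) * x_1 into the constraint x_1 + .. + x_m = X gives x_1 directly, with no higher-degree equation to solve. A small exact-arithmetic sketch (helper name is mine):

```python
from fractions import Fraction

def solve_by_ratios(X, a):
    """Substitute x_i = (a_i/a_1) * x_1 into x_1 + ... + x_m = X.
    The resulting equation is linear in x_1, so it solves immediately."""
    a = [Fraction(ai) for ai in a]
    # x_1 * (1 + a_2/a_1 + ... + a_m/a_1) = X  =>  x_1 = a_1 * X / A
    coeff = sum(ai / a[0] for ai in a)
    x1 = Fraction(X) / coeff
    return [ai / a[0] * x1 for ai in a]

print(solve_by_ratios(12, [1, 2, 3]))  # [Fraction(2, 1), Fraction(4, 1), Fraction(6, 1)]
```

                  The result matches the closed form x_i = (a_i / A) * X from the derivation above.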

                  • + 0 comments

                    OK, I understand now. You used .