Now that F# (and .NET 4.0 in general) has support for big integers through the
BigInteger type, Project Euler problem 13 has become a trivial one:
Work out the first ten digits of the sum of the following one-hundred 50-digit numbers.
Solution (note that I have removed most of the input numbers to improve readability):
#light
open System.Linq
open System.Numerics

let numbers =
    [|37107287533902102798797998220837590246510135740250I;
      46376937677490009712648124896970078050417018260538I;
      74324986199524741059474233309513058123726617309629I;
      ...
      53503534226472524250874054075591789781264330331690I|]

let solve_problem_13 () =
    numbers
        .Aggregate(0I, fun (x:BigInteger) (y:BigInteger) -> x + y)
        .ToString().Substring(0, 10)
Using Linq, it's easy to compute the solution with a simple aggregation. Note that I couldn't use the
Sum function directly because it does not have an overload for BigInteger.
Without support for such big numbers, one could have reasoned that, in order to calculate the first 10 digits of the result, only the first 11 digits of each number are relevant. With this intuition, the problem would have been tractable using simple 64-bit integer arithmetic: the sum of one hundred 11-digit prefixes has at most 13 digits, which fits comfortably in an int64.
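That prefix-based approach might be sketched like this (firstTenViaPrefixes is a hypothetical helper, not part of the original solution; in rare cases a carry from the discarded digits could still perturb the last digit):

```fsharp
let firstTenViaPrefixes (numbers: string[]) =
    numbers
    |> Array.sumBy (fun s -> int64 (s.Substring(0, 11)))  // keep only the first 11 digits of each number
    |> string                                             // render the int64 sum as a string
    |> fun s -> s.Substring(0, 10)                        // take the first 10 digits
```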
Problem 13 is a classical problem that can be solved using parallel aggregation: it can be decomposed into many independent sub-tasks (each calculating the sum of 10 numbers, for example), and the final result can be obtained by aggregating the results from each of them.
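That decomposition can also be written out explicitly, without PLINQ, as a rough sketch (parallelChunkedSum is a hypothetical name, and Array.chunkBySize is only available in later versions of FSharp.Core):

```fsharp
let parallelChunkedSum (numbers: bigint[]) =
    numbers
    |> Array.chunkBySize 10          // split into independent sub-tasks of 10 numbers each
    |> Array.Parallel.map Array.sum  // sum each chunk concurrently on the thread pool
    |> Array.sum                     // aggregate the partial sums sequentially
```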
With AsParallel, scaling this problem across multiple cores is trivial:
let solve_problem_13 () =
    numbers
        .AsParallel() // MAKE IT PARALLEL!
        .Aggregate(0I, fun (x:BigInteger) (y:BigInteger) -> x + y)
        .ToString().Substring(0, 10)
Of course, with such a trivial problem, you should not expect big performance improvements from the parallel version; indeed, the overhead of splitting the work across threads will often outweigh the gains and lengthen the time it takes to get to the result.