I would like to calculate the error between an approximate Function and an exact solution. When constructing an Expression to represent the exact solution, the keyword degree can be set. However, once the Expression has been constructed, I can't find a way to change it...
Take for example:
from dolfin import *
parameters["krylov_solver"]["relative_tolerance"] = 1e-15
mesh = UnitSquareMesh(1,1)
V = FunctionSpace(mesh, "CG", 1) # Linear space
# Do not set degree
exact = Expression("x[0]*x[0]") # Quadratic function
approx = interpolate(exact, V)
print(assemble(inner(approx-exact,approx-exact)*dx)) # Understandable that this is zero
# Set degree in construction
exact = Expression("x[0]*x[0]", degree=2)
print(assemble(inner(approx-exact,approx-exact)*dx)) # Expect this to be 1/30
# Set degree after construction...
exact = Expression("x[0]*x[0]")
exact.degree = 2 # Would be nice if this worked
print(assemble(inner(approx-exact,approx-exact)*dx)) # Is zero, should be 1/30?
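As a sanity check of what the degree=2 assembly should return, the squared L2 error can be evaluated with plain numpy quadrature, independently of FEniCS. (The reduction of the interpolant to x[0] and the value 1/30 are my own calculation, not from the snippet above.)

```python
import numpy as np

# On UnitSquareMesh(1, 1) the CG1 nodal interpolant of x[0]*x[0] matches
# x**2 at the vertices x[0] = 0 and x[0] = 1, so on every cell it reduces
# to the linear function x[0]; the error is independent of x[1].
f = lambda x: x**2       # exact solution
interp = lambda x: x     # piecewise-linear nodal interpolant

x = np.linspace(0.0, 1.0, 100001)
y = (interp(x) - f(x))**2
dx = x[1] - x[0]
err2 = dx * (y.sum() - 0.5 * (y[0] + y[-1]))  # trapezoidal rule
print(err2)  # ~1/30, the squared L2 error
```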
Basically, I want to know the best practice for computing the error between an arbitrary Expression and a Function. Does it require an expensive projection of the Expression onto a higher-order FunctionSpace?
Anybody got any ideas?