Hi all,
I have implemented a new Expression for a mollifier function of the form:
$$ f(x) = \sum_i \exp( -\frac{1}{l^2} \| x - x_i \|^2_C ) $$
where $l$ is a correlation length, $C$ is an s.p.d. matrix (a metric), and $x_i$ is a given set of points.
I wrote the following code, which works correctly but is extremely slow (it takes several minutes to interpolate $f(x)$ on a linear finite element space on a 64-by-64 structured grid).
Is this because of the Python implementation? Are there big performance losses when subclassing Expression?
Do you have any suggestions on how to speed it up without resorting to a C++ implementation?
Also, do you have any suggestions on how to implement $\| x - x_i\|_C^2 = (x-x_i)^T C (x - x_i)$ using matrix-vector products and inner products?
import math

import dolfin as dl
import numpy as np

class Mollifier(dl.Expression):
    """f(x) = sum_i exp( -||x - x_i||_C^2 / l^2 )."""
    def __init__(self, l, metric, locations):
        self.l2 = l*l
        self.metric = metric
        self.nloc = locations.shape[0]
        self.ndim = locations.shape[1]
        self.locations = locations

    def eval(self, value, x):
        assert self.ndim == 2
        value[0] = 0.
        dx = np.ndarray(self.ndim)  # scratch vector, allocated once and reused
        for ip in range(self.nloc):
            for idim in range(self.ndim):
                dx[idim] = x[idim] - self.locations[ip, idim]
            # (x - x_i)^T C (x - x_i), written out for the 2d symmetric case
            e = dx[0]*dx[0]*self.metric[0, 0] + dx[1]*dx[1]*self.metric[1, 1] \
                + 2.*dx[0]*dx[1]*self.metric[0, 1]
            value[0] += math.exp(-e/self.l2)  # math.exp on plain floats, not dl.exp (UFL)
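On the vectorization question: one option is to evaluate the whole sum with NumPy array operations instead of the Python loops, which typically gives a large speedup without touching C++. A sketch (the helper name `mollifier_value` is mine, not a dolfin API); the quadratic form $(x-x_i)^T C (x-x_i)$ is computed for all points at once with `einsum`:

```python
import numpy as np

def mollifier_value(x, locations, metric, l2):
    """Vectorized f(x) = sum_i exp( -||x - x_i||_C^2 / l2 ).

    locations has shape (nloc, ndim), metric is (ndim, ndim) s.p.d.,
    x is a point of shape (ndim,).
    """
    dx = x - locations  # broadcasting: shape (nloc, ndim)
    # row-wise quadratic form: e[i] = sum_{j,k} dx[i,j] * C[j,k] * dx[i,k]
    e = np.einsum('ij,jk,ik->i', dx, metric, dx)
    return np.exp(-e / l2).sum()
```

Inside `eval` you would then just write `value[0] = mollifier_value(np.asarray(x), self.locations, self.metric, self.l2)`; this also removes the hard-coded 2d assumption, since `einsum` handles any `ndim`.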