I recently came across the following problem: how do you work out the Fisher information for a parameter when the likelihood function, say $f(x;\theta)$, is continuous but not everywhere differentiable with respect to the parameter? The standard formula for the Fisher information

$$ I(\theta) = -\mathbb{E}\left[ \frac{\partial^2}{\partial\theta^2} \log f(X;\theta) \right] $$
assumes regularity conditions which no longer hold, and hence is not applicable. After hunting around with Google for some time, I came across the following (freely downloadable) paper

H. E. Daniels

The Asymptotic Efficiency of a Maximum Likelihood Estimator

Fourth Berkeley Symp. on Math. Statist. and Prob., University of California Press, 1961, 1, 151-163

which, fortunately, had exactly what I needed. It turns out that we can still compute the Fisher information, without the existence of second derivatives, using

$$ I(\theta) = \mathbb{E}\left[ \left( \frac{\partial}{\partial\theta} \log f(X;\theta) \right)^2 \right] $$
provided a set of weaker conditions holds. To get my head around the issue, I decided to look at the simple problem of working out the Fisher information for the mean of a Laplace distribution with location (mean) $\mu$ and scale $b$:

$$ f(x;\mu,b) = \frac{1}{2b} \exp\left( -\frac{|x-\mu|}{b} \right) $$
The log-likelihood for a single observation is now given by

$$ \ell(\mu, b; x) = -\log(2b) - \frac{|x - \mu|}{b} $$

The first derivative with respect to $\mu$ exists for all $x \neq \mu$ and is

$$ \frac{\partial \ell}{\partial \mu} = \frac{\operatorname{sgn}(x - \mu)}{b} $$

so its square equals $1/b^2$ almost surely, and hence the Fisher information for the mean is

$$ I(\mu) = \mathbb{E}\left[ \left( \frac{\partial \ell}{\partial \mu} \right)^2 \right] = \frac{1}{b^2} $$
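As a quick sanity check (my own simulation sketch, not from the Daniels paper): the maximum likelihood estimator of the Laplace location parameter is the sample median, so if the Fisher information for the mean is $1/b^2$, the variance of the median over repeated samples should be close to $b^2/n$ for large $n$:

```python
import numpy as np

# Simulate many Laplace samples and compare the empirical variance of the
# sample median (the MLE of the location parameter) with the asymptotic
# variance 1 / (n * I(mu)) = b^2 / n predicted by the Fisher information.
rng = np.random.default_rng(42)
mu, b, n, reps = 0.0, 2.0, 500, 10_000

samples = rng.laplace(loc=mu, scale=b, size=(reps, n))
medians = np.median(samples, axis=1)

print(np.var(medians))  # empirical variance of the MLE
print(b**2 / n)         # asymptotic variance from the Fisher information
```

The two printed values agree to within Monte Carlo error, which is reassuring given that the second derivative of the log-likelihood does not exist at $x = \mu$.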
The Fisher information for the scale parameter can be obtained in a similar manner. The first derivative with respect to $b$ is

$$ \frac{\partial \ell}{\partial b} = -\frac{1}{b} + \frac{|x - \mu|}{b^2} $$
Since

$$ \mathbb{E}\,|X - \mu| = b \quad \text{and} \quad \mathbb{E}\left[ (X - \mu)^2 \right] = 2b^2 $$

the Fisher information for the scale parameter is

$$ I(b) = \mathbb{E}\left[ \left( \frac{\partial \ell}{\partial b} \right)^2 \right] = \frac{1}{b^2} - \frac{2}{b^2} + \frac{2}{b^2} = \frac{1}{b^2} $$
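This one can also be checked numerically. The sketch below (again my own, assuming nothing beyond NumPy) draws Laplace samples and averages the squared score with respect to $b$:

```python
import numpy as np

# Monte Carlo estimate of E[(d/db log f)^2] for the Laplace distribution,
# which should equal the Fisher information 1 / b^2 derived above.
rng = np.random.default_rng(1)
mu, b = 0.0, 1.5
x = rng.laplace(loc=mu, scale=b, size=2_000_000)

# Score with respect to the scale parameter: -1/b + |x - mu| / b^2
score = -1.0 / b + np.abs(x - mu) / b**2

print(np.mean(score**2))  # ~ 1 / b^2 ~ 0.444 for b = 1.5
```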
The paper by Daniels details the exact conditions needed for the Fisher information to be derived in this way. Note that there is a mistake in one of the proofs; the correction is detailed in

J. A. Williamson

A Note on the Proof by H. E. Daniels of the Asymptotic Efficiency of a Maximum Likelihood Estimator

Biometrika, 1984, 71, 651-653

Unfortunately, the Williamson paper requires a JSTOR subscription for download.