Based on $n$ independent observations distributed according to some probability measure on $\R^d$ with compactly supported continuous Lebesgue density $p$, we develop plug-in density support estimators that adapt simultaneously to the local smoothness of the density $p$ near the unknown support boundary and to its margin exponent. To this end, we propose a new scheme for locally adaptive bandwidth selection that shrinks the bandwidth of a kernel estimator near the unknown support boundary. For a density that is H\"older continuous with exponent $\beta>0$, this locally minimax-optimal bandwidth is smaller than the usual rate $n^{-1/(2\beta+d)}$, even when the smoothness of $p$ is homogeneous. Besides classical minimax risk bounds at a fixed point, we derive new pointwise risk bounds for this adaptive density estimator along a shrinking neighborhood of the support boundary, which demonstrate that it converges faster than classical adaptive estimators. With this improved density estimator, we construct plug-in support estimators with a data-driven offset. They are shown to be minimax-optimal with respect to the Lebesgue measure of the symmetric difference of sets, up to a logarithmic factor.
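Schematically, and with notation ($\hat p_n$, $\hat\lambda_n$, $\hat S_n$) introduced here only for illustration, a plug-in support estimator of this kind takes the form
\begin{equation*}
  \hat S_n \,=\, \bigl\{ x \in \R^d : \hat p_n(x) \ge \hat\lambda_n \bigr\},
\end{equation*}
where $\hat p_n$ is the locally adaptive kernel density estimator and $\hat\lambda_n$ is the data-driven offset; the risk above is then measured by $\mathbb{E}\bigl[\mathrm{Leb}\bigl(\hat S_n \,\triangle\, \operatorname{supp} p\bigr)\bigr]$, with $\mathrm{Leb}$ denoting Lebesgue measure on $\R^d$ and $\triangle$ the symmetric difference of sets.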