Sufficient Dimension Reduction for Average Causal Effect Estimation

September 14, 2020 Β· Declared Dead Β· πŸ› Data mining and knowledge discovery

πŸ‘» CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Debo Cheng, Jiuyong Li, Lin Liu, Jixue Liu
arXiv ID: 2009.06444
Category: stat.ME (cross-listed: cs.AI)
Citations: 16
Venue: Data mining and knowledge discovery
Last checked: 1 month ago
Abstract
Having a large number of covariates can degrade the quality of causal effect estimation, since confounding adjustment becomes unreliable when the number of covariates is large relative to the available sample size. The propensity score is a common way to deal with a large covariate set, but the accuracy of propensity score estimation (normally done by logistic regression) is also challenged by a large number of covariates. In this paper, we prove that a large covariate set can be reduced to a lower-dimensional representation which captures the complete information needed for adjustment in causal effect estimation. This theoretical result enables effective data-driven algorithms for causal effect estimation. We develop an algorithm which employs a supervised kernel dimension reduction method to search for a lower-dimensional representation of the original covariates, and then uses nearest-neighbor matching in the reduced covariate space to impute the counterfactual outcomes, avoiding the problems of a large covariate set. The proposed algorithm is evaluated on two semi-synthetic and three real-world datasets, and the results demonstrate its effectiveness.
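Since no code was released, here is a minimal sketch of the two-step pipeline the abstract describes: reduce the covariates to a lower-dimensional representation, then use 1-nearest-neighbor matching in the reduced space to impute counterfactual outcomes and average the unit-level differences. Note the assumptions: the paper uses a *supervised* kernel dimension reduction method, for which scikit-learn's (unsupervised) `KernelPCA` is substituted here as a stand-in, and the function name `estimate_ate` and its parameters are illustrative, not from the paper.

```python
import numpy as np
from sklearn.decomposition import KernelPCA  # stand-in for supervised KDR
from sklearn.neighbors import NearestNeighbors


def estimate_ate(X, t, y, n_components=2):
    """Sketch: average treatment effect via matching in a reduced space.

    X : (n, p) covariate matrix
    t : (n,) binary treatment indicator (0/1)
    y : (n,) observed outcome
    """
    # Step 1: map the covariates to a lower-dimensional representation.
    # (The paper uses supervised kernel dimension reduction; KernelPCA
    # is an unsupervised substitute used here only for illustration.)
    Z = KernelPCA(n_components=n_components, kernel="rbf").fit_transform(X)

    treated, control = Z[t == 1], Z[t == 0]
    y_t, y_c = y[t == 1], y[t == 0]

    # Step 2: 1-NN matching in the reduced space to impute each unit's
    # missing counterfactual outcome from the opposite treatment group.
    idx_c = (NearestNeighbors(n_neighbors=1).fit(control)
             .kneighbors(treated, return_distance=False).ravel())
    idx_t = (NearestNeighbors(n_neighbors=1).fit(treated)
             .kneighbors(control, return_distance=False).ravel())

    effect_treated = y_t - y_c[idx_c]   # observed minus matched control
    effect_control = y_t[idx_t] - y_c   # matched treated minus observed
    return np.concatenate([effect_treated, effect_control]).mean()
```

On simulated data with a constant treatment effect, this recovers the effect up to matching bias; swapping in an actual supervised kernel dimension reduction step is what the paper's theory justifies.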
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

πŸ“œ Similar Papers

In the same crypt β€” stat.ME

Died the same way β€” πŸ‘» Ghosted