Extending the Modelling Capacity of Gaussian Conditional Random Fields while Learning Faster
Abstract
Gaussian Conditional Random Fields (GCRF) are a type of structured regression model that incorporates multiple predictors and multiple graphs. This is achieved by defining quadratic term feature functions in Gaussian canonical form which makes the conditional log-likelihood function convex and hence allows finding the optimal parameters by learning from data. In this work, the parameter space for the GCRF model is extended to facilitate joint modelling of positive and negative influences. This is achieved by restricting the model to a single graph and formulating linear bounds on convexity with respect to the model's parameters. In addition, our formulation for the model using one network allows calculating gradients much faster than alternative implementations. Lastly, we extend the model one step further and incorporate a bias term into our link weight. This bias is solved as part of the convex optimization. Benefits of the proposed model in terms of improved accuracy and speed are characterized on several synthetic graphs with 2 million links as well as on a hospital admissions prediction task represented as a human disease-symptom similarity network corresponding to more than 35 million hospitalization records in California over 9 years.
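For context, a minimal sketch of the standard GCRF density that this work extends; the notation follows the general GCRF literature and is an assumption on our part, not taken from this record:

$$
P(\mathbf{y} \mid \mathbf{x}) = \frac{1}{Z(\mathbf{x})} \exp\Big( -\sum_{i=1}^{N} \sum_{k=1}^{K} \alpha_k \big( y_i - R_k(\mathbf{x}) \big)^2 \;-\; \sum_{i \neq j} \beta \, S_{ij}(\mathbf{x}) \, (y_i - y_j)^2 \Big)
$$

Here the $R_k$ are $K$ unstructured predictors, $S_{ij}$ is the link weight of the (single) similarity graph, and $Z(\mathbf{x})$ is the normalization constant. Because the exponent is quadratic in $\mathbf{y}$, the density is a Gaussian in canonical form, and learning reduces to a convex optimization over $\alpha_k$ and $\beta$ as long as the implied precision matrix stays positive definite. The conventional way to guarantee this is to constrain $\alpha_k \geq 0$ and $\beta \geq 0$; the extension summarized in the abstract instead derives linear bounds on the parameters under which a negative $\beta$ (a negative influence) still yields a valid convex learning problem, and fits a bias term in the link weight (e.g. replacing $S_{ij}$ with $S_{ij} + b$, with the exact form left to the paper) within the same convex optimization.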
Source:
30th AAAI Conference on Artificial Intelligence, AAAI 2016, 2016, 1596-1602
Publisher:
- AAAI press
Funding / projects:
- DARPA [FA9550-12-1-0406]
- NSF BIGDATA grant [14476570]
- ONR [N00014-15-1-2729]
- SNSF Joint Research project (SCOPES) [IZ73Z0_152415]
- Division of Social and Economic Sciences
- Directorate for Social, Behavioral & Economic Sciences [1659998] (Funding Source: National Science Foundation)
Institution/group
Fakultet organizacionih nauka

TY - CONF
AU - Glass, Jesse
AU - Ghalwash, Mohamed
AU - Vukićević, Milan
AU - Obradović, Zoran
PY - 2016
UR - https://rfos.fon.bg.ac.rs/handle/123456789/1615
AB - Gaussian Conditional Random Fields (GCRF) are a type of structured regression model that incorporates multiple predictors and multiple graphs. This is achieved by defining quadratic term feature functions in Gaussian canonical form which makes the conditional log-likelihood function convex and hence allows finding the optimal parameters by learning from data. In this work, the parameter space for the GCRF model is extended to facilitate joint modelling of positive and negative influences. This is achieved by restricting the model to a single graph and formulating linear bounds on convexity with respect to the model's parameters. In addition, our formulation for the model using one network allows calculating gradients much faster than alternative implementations. Lastly, we extend the model one step further and incorporate a bias term into our link weight. This bias is solved as part of the convex optimization. Benefits of the proposed model in terms of improved accuracy and speed are characterized on several synthetic graphs with 2 million links as well as on a hospital admissions prediction task represented as a human disease-symptom similarity network corresponding to more than 35 million hospitalization records in California over 9 years.
PB - AAAI press
C3 - 30th AAAI Conference on Artificial Intelligence, AAAI 2016
T1 - Extending the Modelling Capacity of Gaussian Conditional Random Fields while Learning Faster
EP - 1602
SP - 1596
UR - conv_3472
ER -
@conference{conv_3472,
  author = "Glass, Jesse and Ghalwash, Mohamed and Vukićević, Milan and Obradović, Zoran",
  year = "2016",
  abstract = "Gaussian Conditional Random Fields (GCRF) are a type of structured regression model that incorporates multiple predictors and multiple graphs. This is achieved by defining quadratic term feature functions in Gaussian canonical form which makes the conditional log-likelihood function convex and hence allows finding the optimal parameters by learning from data. In this work, the parameter space for the GCRF model is extended to facilitate joint modelling of positive and negative influences. This is achieved by restricting the model to a single graph and formulating linear bounds on convexity with respect to the model's parameters. In addition, our formulation for the model using one network allows calculating gradients much faster than alternative implementations. Lastly, we extend the model one step further and incorporate a bias term into our link weight. This bias is solved as part of the convex optimization. Benefits of the proposed model in terms of improved accuracy and speed are characterized on several synthetic graphs with 2 million links as well as on a hospital admissions prediction task represented as a human disease-symptom similarity network corresponding to more than 35 million hospitalization records in California over 9 years.",
  publisher = "AAAI press",
  journal = "30th AAAI Conference on Artificial Intelligence, AAAI 2016",
  title = "Extending the Modelling Capacity of Gaussian Conditional Random Fields while Learning Faster",
  pages = "1596-1602",
  url = "conv_3472"
}
Glass, J., Ghalwash, M., Vukićević, M., & Obradović, Z. (2016). Extending the Modelling Capacity of Gaussian Conditional Random Fields while Learning Faster. In 30th AAAI Conference on Artificial Intelligence, AAAI 2016 (pp. 1596-1602). AAAI press. conv_3472
Glass J, Ghalwash M, Vukićević M, Obradović Z. Extending the Modelling Capacity of Gaussian Conditional Random Fields while Learning Faster. In: 30th AAAI Conference on Artificial Intelligence, AAAI 2016. 2016:1596-1602. conv_3472.
Glass, Jesse, Ghalwash, Mohamed, Vukićević, Milan, Obradović, Zoran, "Extending the Modelling Capacity of Gaussian Conditional Random Fields while Learning Faster" in 30th AAAI Conference on Artificial Intelligence, AAAI 2016 (2016): 1596-1602. conv_3472.