1. Holtrop, J., and Mennen, G. G. J., (1982), An approximate power prediction method, International Shipbuilding Progress, 29(335), p.166-170. [DOI:10.3233/ISP-1982-2933501]
2. ITTC, (2017), Practical guidelines for ship resistance tests, International Towing Tank Conference.
3. Molland, A. F., Turnock, S. and Forbes, P., (2010), Principles of Naval Architecture, Society of Naval Architects and Marine Engineers.
4. Blevins, R. D., (2014), Applied Fluid Dynamics Handbook, Krieger Publishing Company.
5. Larsson, L., Stern, F. and Visonneau, M. (Eds.), (2014), Numerical ship hydrodynamics: An assessment of the Gothenburg 2010 workshop, Springer. [DOI:10.1007/978-94-007-7189-5]
6. Panda, J. P., (2021), Machine Learning for Naval Architecture, Ocean and Marine Engineering, arXiv:2109.05574 (CC BY 4.0).
7. Gerritsma, J., Onnink, R. and Versluis, A., (1981), Geometry, Resistance and Stability of the Delft Systematic Yacht Hull Series, Delft University of Technology. [DOI:10.3233/ISP-1981-2832801]
8. Chen, T., and Guestrin, C., (2016), XGBoost: A scalable tree boosting system, Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, p.785-794. [DOI:10.1145/2939672.2939785]
9. Dorogush, A. V., Gulin, A., Kazeev, V. and Prokhorenkova, L., (2018), CatBoost: gradient boosting with categorical features support, arXiv preprint.
10. Freund, Y., and Schapire, R. E., (1997), A decision-theoretic generalization of on-line learning and an application to boosting, Journal of Computer and System Sciences, 55(1), p.119-139. [DOI:10.1006/jcss.1997.1504]
11. Django Software Foundation, (2023), Django: A high-level Python web framework.
12. Harris, C. R., et al., (2020), Array programming with NumPy, Nature, 585(7825), p.357-362. [DOI:10.1038/s41586-020-2649-2]
13. McKinney, W., (2010), Data structures for statistical computing in Python, Proceedings of the 9th Python in Science Conference, p.51-56. [DOI:10.25080/Majora-92bf1922-00a]
14. Pedregosa, F., et al., (2011), Scikit-learn: Machine learning in Python, Journal of Machine Learning Research, 12, p.2825-2830.
15. Tukey, J. W., (1977), Exploratory Data Analysis, Addison-Wesley.
16. Witten, I. H., Frank, E., Hall, M. A. and Pal, C. J., (2016), Data Mining: Practical Machine Learning Tools and Techniques (4th ed.), Morgan Kaufmann.
17. Pearson, K., (1895), Notes on regression and inheritance in the case of two parents, Proceedings of the Royal Society of London, 58, p.240-242. [DOI:10.1098/rspl.1895.0041]
18. Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., ... and Liu, T. Y., (2017), LightGBM: A highly efficient gradient boosting decision tree, Advances in Neural Information Processing Systems, 30, p.3146-3154.
19. Breiman, L., (2001), Random forests, Machine Learning, 45(1), p.5-32. [DOI:10.1023/A:1010933404324]
20. Cortes, C., and Vapnik, V., (1995), Support-vector networks, Machine Learning, 20(3), p.273-297. [DOI:10.1023/A:1022627411411]
21. Kohavi, R., (1995), A study of cross-validation and bootstrap for accuracy estimation and model selection, Proceedings of the 14th International Joint Conference on Artificial Intelligence, 2(12), p.1137-1143.
22. Bergstra, J., and Bengio, Y., (2012), Random search for hyper-parameter optimization, Journal of Machine Learning Research, 13(Feb), p.281-305.
23. Friedman, J. H., (2001), Greedy Function Approximation: A Gradient Boosting Machine, The Annals of Statistics, 29(5), p.1189-1232. [DOI:10.1214/aos/1013203451]
24. Géron, A., (2019), Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow (2nd ed.), O'Reilly Media. ISBN: 978-1492032649.