Volume 21, Issue 1 (1-2025) | ijmt 2025, 21(1): 71-79


Khorsandi P, Hajivand A. AI-Driven Ship Resistance Prediction Using Three Key Hydrodynamic Parameters. ijmt 2025; 21(1): 71-79.
URL: http://ijmt.ir/article-1-869-en.html
1- Khorramshahr University of Marine Science and Technology
2- Khorramshahr University of Marine Science and Technology
Abstract:
This paper introduces an AI-driven methodology for predicting ship resistance from only three fundamental input parameters: Length at the Waterline (LWL), Beam at the Waterline (BWL), and Draft (T). Traditional resistance prediction techniques, such as empirical methods, towing tank experiments, and computational fluid dynamics (CFD) simulations, are highly accurate but involve significant time, cost, and complexity. Our approach applies machine learning algorithms, including XGBoost, CatBoost, and Gradient Boosting, to derive a comprehensive suite of hydrodynamic characteristics from a dataset of 308 full-scale experiments across 22 different hull shapes. The methodology begins with careful data preprocessing and feature engineering, including normalization, outlier analysis, and correlation assessment, to ensure reliability and minimize error propagation. By transforming raw hydrodynamic data into dimensionless groups, the models capture both linear and non-linear relationships among critical parameters such as displacement, wetted surface area, midship section area, waterplane area, and the longitudinal center of buoyancy (LCB). Simple linear regression was sufficient to derive parameters with perfect correlations, while more complex non-linear interactions were predicted accurately using advanced ensemble methods. Integrating these models into a Django-based web application gives naval architects and marine engineers a user-friendly, real-time tool for design optimization and performance evaluation. Comparative analysis indicates that the streamlined model predicts residual and frictional resistance with accuracy comparable to traditional methods, while offering significant gains in computational efficiency and cost-effectiveness. Overall, this research bridges classical hydrodynamic theory and modern artificial intelligence, offering a rapid, reliable, and scalable solution for ship resistance prediction that has the potential to significantly improve early-stage design in naval architecture.
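To make the described workflow concrete, the following is a minimal Python sketch of the pipeline the abstract outlines: deriving dimensionless groups from LWL, BWL, and T, then fitting a gradient-boosting regressor to a resistance target. The column names, synthetic data, target formula, and the particular dimensionless ratios are illustrative assumptions only; they do not reproduce the paper's actual dataset or feature set.

```python
# Minimal sketch of the abstract's pipeline, assuming hypothetical
# column names and synthetic data (the paper's 308-experiment dataset
# is not reproduced here).
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 308  # same size as the paper's dataset, but synthetic values

df = pd.DataFrame({
    "LWL": rng.uniform(5.0, 12.0, n),  # waterline length [m]
    "BWL": rng.uniform(1.5, 4.0, n),   # waterline beam [m]
    "T":   rng.uniform(0.3, 1.0, n),   # draft [m]
})

# Feature engineering: dimensionless groups (illustrative choices,
# not necessarily the paper's exact set)
df["L_B"] = df["LWL"] / df["BWL"]
df["B_T"] = df["BWL"] / df["T"]

# Placeholder target: a synthetic stand-in for a residual
# resistance coefficient, with some noise added
y = 1e-3 * (df["L_B"] ** -1.5) * (1 + 0.1 * rng.standard_normal(n))

X_train, X_test, y_train, y_test = train_test_split(
    df, y, test_size=0.2, random_state=0)

# Fit an ensemble regressor and report held-out error
model = GradientBoostingRegressor(random_state=0)
model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```

Because XGBoost's XGBRegressor and CatBoost's CatBoostRegressor expose the same fit/predict interface, the other ensemble methods named in the abstract could be swapped into this sketch with a one-line change.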

Type of Study: Research Paper | Subject: Ship Hydrodynamics
Received: 2025/05/13 | Accepted: 2025/07/11

Rights and permissions
This work, like the International Journal of Maritime Technology as a whole, is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.