[{"data":1,"prerenderedAt":44},["ShallowReactive",2],{"article-detail-journal":3,"article-detail:70":18},["Reactive",4],{"title":5,"description":6,"cover_image":7,"overview":8,"issn":9,"publisher":10,"publishing_mode":11,"impact_factor":12,"impact_factor_5year":12,"submission_to_decision_days":13,"downloads":14,"id":15,"created_at":16,"updated_at":17},"World Journal of Young and Excellent Scholars","World Journal of Young and Excellent Scholars (JYES) is a peer-reviewed, open-access, interdisciplinary journal that supports the academic work of pre-collegiate and adolescent researchers. The journal aims to provide a formal and professional platform for young scholars from around the world to publish their original ideas, research findings, and innovative methods. By connecting secondary education with the broader academic community, JYES helps young researchers take part in scholarly communication at an early stage. The journal is committed to promoting academic excellence through rigorous peer review, high ethical standards, and broad international visibility.\n\nJYES welcomes high-quality submissions that show clear research questions, critical thinking, and strong scientific methods. The journal covers a wide range of STEM fields, including natural sciences, engineering, mathematics, and computer science. Through its open-access model, all published articles are freely available to readers worldwide, helping young scholars gain greater exposure and recognition for their work.","\u002Fimages\u002Fuploads\u002Fjournals\u002F1\u002Febcb34b6ba1e45e09273bbfd391101ab_cover image.png","\u003Cp>\u003Cstrong>Journal Title: \u003C\u002Fstrong>\u003C\u002Fp>\u003Cp>World Journal of Young and Excellent Scholars\u003C\u002Fp>\u003Cp>\u003Cstrong>Journal Type:\u003C\u002Fstrong> \u003C\u002Fp>\u003Cp>Peer-reviewed, open-access, interdisciplinary journal\u003C\u002Fp>\u003Cp>\u003Cstrong>Aim and Mission:\u003C\u002Fstrong> \u003C\u002Fp>\u003Cp style=\"text-align: justify;\">World Journal of Young and Excellent Scholars (JYES) is dedicated to supporting and promoting the scholarship of young researchers, especially pre-collegiate and adolescent scholars. Its mission is to provide a professional publication venue where young investigators can share intellectual discoveries, develop academic confidence, and receive global recognition through a rigorous scholarly process.\u003C\u002Fp>\u003Cp>\u003Cstrong>Scope:\u003C\u002Fstrong> \u003C\u002Fp>\u003Cp>The journal publishes research across the full range of STEM disciplines, including:\u003C\u002Fp>\u003Cp>•Natural Sciences: Physics, Chemistry, Biology, and Environmental Science.\u003C\u002Fp>\u003Cp>•Engineering: Mechanical, Electrical, Civil, and Material Engineering.\u003C\u002Fp>\u003Cp>•Mathematics: Pure and Applied Mathematics, Statistics.\u003C\u002Fp>\u003Cp>•Computer Science: Artificial Intelligence, Data Science, Algorithms, and Software Engineering.\u003C\u002Fp>\u003Cp>\u003Cstrong>Article Types: \u003C\u002Fstrong>\u003C\u002Fp>\u003Cp>Original Research, Reviews, Short Communications.\u003C\u002Fp>\u003Cp>\u003Cstrong>Open Access Policy: \u003C\u002Fstrong>\u003C\u002Fp>\u003Cp>JYES follows a fully open-access publishing model. All articles are freely accessible to readers worldwide without subscription or paywall restrictions.\u003C\u002Fp>\u003Cp>\u003Cstrong>Copyright and License: \u003C\u002Fstrong>\u003C\u002Fp>\u003Cp style=\"text-align: justify;\">Authors retain the copyright to their work. 
All published papers are distributed under the Creative Commons Attribution 4.0 International License (CC BY 4.0), which allows unrestricted use, sharing, and reproduction, provided the original source is properly cited.\u003C\u002Fp>\u003Cp>\u003Cstrong>Publisher: \u003C\u002Fstrong>\u003C\u002Fp>\u003Cp>Association of Global Intelligent Science and Technology (AGIST)\u003C\u002Fp>\u003Cp>\u003Cstrong>Publication Frequency: \u003C\u002Fstrong>\u003C\u002Fp>\u003Cp>Quarterly (4 issues per year)\u003C\u002Fp>\u003Cp>\u003Cstrong>Peer Review and Publication Timeline:\u003C\u002Fstrong>\u003C\u002Fp>\u003Cp>Submission to First Decision: approximately 30 days\u003C\u002Fp>\u003Cp>Submission to Final Acceptance: approximately 70 days\u003C\u002Fp>\u003Cp>Acceptance to Publication: approximately 20 days\u003C\u002Fp>\u003Cp>\u003Cstrong>Editorial Independence: \u003C\u002Fstrong>\u003C\u002Fp>\u003Cp style=\"text-align: justify;\">JYES is supported by AGIST and maintains full editorial independence. All editorial decisions and the double-blind peer-review process are based only on academic quality, scientific validity, and ethical publishing standards, without commercial influence.\u003C\u002Fp>","","AGIST","hybrid",3,90,0,1,"2026-03-18T08:15:34.180691Z","2026-04-08T02:11:52.603854Z",{"id":19,"title":20,"abstract":21,"type":22,"doi":-1,"keywords":23,"authors":27,"author_ids":-1,"issue":34,"page_start":39,"page_end":35,"view_count":40,"download_count":41,"published_date":37,"created_at":42,"funds":43},70,"EcoFormer: A Sparse Transformer Framework for Predictive Energy Optimization in Cloud Data Centers","The rapid expansion of cloud computing has led to an exponential increase in energy consumption within data centers, posing significant environmental and economic challenges. Effective resource provisioning relies heavily on accurate workload forecasting to enable proactive auto-scaling. However, modern cloud workloads exhibit extreme non-stationarity and complex temporal dependencies that overwhelm traditional methods like ARIMA or standard RNNs. While Transformers have shown promise in sequence modeling, their quadratic computational complexity $\\mathcal{O}(L^2)$ hinders real-time deployment for long-term forecasting. In this paper, we propose EcoFormer, a resource-efficient time-series forecasting framework. EcoFormer introduces a Probabilistic Sparse Attention mechanism that selects only the most dominant queries, reducing complexity to $\\mathcal{O}(L \\log L)$. Furthermore, we incorporate a Green-Regularized Loss that explicitly penalizes over-provisioning during idle periods. We provide a theoretical bound on the approximation error of our sparse attention matrix. Extensive experiments on the Google Cluster Trace dataset demonstrate that EcoFormer reduces Mean Absolute Error (MAE) by 14.5\\% compared to standard Transformers and achieves an estimated energy saving of 18.2\\% in simulated auto-scaling scenarios.","regular",[24,25,26],"AI","Time Series Forecasting","Sparse Attention",[28],{"id":29,"display_name":30,"first_name":30,"middle_name":-1,"last_name":9,"orcid":-1,"avatar":-1,"email":31,"affiliation":-1,"bio":-1,"created_at":9,"updated_at":9,"affiliations":32,"articles":33},20,"Bohan Zhang","bh.zhang3767@outlook.com",[],[],{"id":35,"volume_number":15,"issue_number":15,"title":-1,"cover_image":36,"publish_date":37,"is_current":38},21,"\u002Fimages\u002Fuploads\u002Fdocuments\u002F79b5a3ffefad4d29addde963b5f8e7e4_iusse1.png","2026-03-28",true,15,231,9,"2026-04-01T07:38:10.343943Z",[],1775646837932]