Journal Information
IEEE Transactions on Network Science and Engineering (TNSE)
https://www.comsoc.org/publications/journals/ieee-tnse
Impact Factor: 7.9
Publisher: IEEE
ISSN: 2334-329X
Viewed: 27193
Tracked: 13
Call For Papers
The IEEE Transactions on Network Science and Engineering is committed to the timely publication of peer-reviewed technical articles that deal with the theory and applications of network science and the interconnections among the elements in a system that form a network. In particular, the journal publishes articles on the understanding, prediction, and control of the structures and behaviors of networks at the fundamental level. The types of networks covered include physical or engineered networks, information networks, biological networks, semantic networks, economic networks, social networks, and ecological networks. The journal aims to discover common principles that govern the structures, functionalities, and behaviors of networks. Another trans-disciplinary focus is the interactions between and co-evolution of different genres of networks. The core topics covered include: Network Sampling and Measurement; Learning of Network Topology; Modeling and Estimation of Network Dynamics; Network Inference; Models of Complex Networks; Modeling of Network Evolution; Network Design; Consensus, Synchronization and Control of Complex Networks; Interactions between and Co-evolution of Different Genres of Networks; Community Formation and Detection; Complex Network Robustness and Vulnerability; Network Interdependency and Cascading Failures; Searching in Complex Networks; Information Diffusion and Propagation; Percolation and Diffusion on Networks; Epidemiology in Complex Systems.
Last updated by Dou Sun on 2025-09-26
Special Issues
Special Issue on Next Generation Blockchain: AI-Empowered Optimization
Submission Date: 2025-11-01
Blockchain technology has evolved through distinct generations: from Bitcoin's pioneering first generation focusing on cryptocurrency transactions, to Ethereum's second generation enabling smart contracts, to the current third generation addressing scalability and cross-chain interoperability. Meanwhile, emerging next-generation concepts are beginning to take shape, introducing quantum resistance, enhanced privacy preservation, and AI-empowered optimizations. As artificial intelligence continues its rapid advancement, AI-driven optimization presents promising solutions to blockchain's persistent challenges: AI-empowered approaches could significantly improve scalability and enable seamless cross-chain communication, representing a crucial technological convergence. This integration is particularly vital as blockchain technology serves as the decentralized trust foundation for emerging digital frontiers, including Web3.0 ecosystems, NFT marketplaces, decentralized finance (DeFi) applications, and the deployment of large-scale AI models. The synthesis of AI and blockchain not only addresses current limitations but also paves the way for more robust, efficient, and interconnected decentralized systems that can support the next generation of innovations.
AI-empowered optimization methods are already beginning to address multiple challenges within blockchain technology. For example, Federated Learning (FL) enables different blockchain networks to collaborate in training cross-chain transaction verification models, enhancing verification efficiency while safeguarding the data privacy of each individual chain. Self-supervised Learning (SSL) can be used to analyze transaction patterns and detect anomalies in historical data: pre-training tasks such as transaction sequence prediction learn the temporal features of transactions, and the learned representations make abnormal transactions easier to detect. Large Language Models (LLMs) can be utilized to generate secure smart contract code and identify vulnerabilities within it; fine-tuned LLMs can also comprehend the protocol specifications of different blockchains, aiding the development of cross-chain communication protocols, generating protocol-conversion code, and verifying protocol correctness. Generative AI (GAI) is being used to generate adversarial samples for testing smart contracts and identifying previously unknown vulnerabilities; it is also employed to generate simulated transaction data that obscures real transactions, enhancing user privacy, and to rapidly translate contract languages across different blockchain protocols, improving interoperability. Deep Reinforcement Learning (DRL) is leveraged to optimize node load distribution in blockchain networks, improve node allocation in sharding, and increase throughput. Together, these advances will enable blockchain systems to address both current and future needs, offering high throughput, low latency, enhanced security, and seamless cross-platform integration while preparing for emerging technological paradigms.
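To make the SSL example above concrete, here is a minimal, self-contained Python sketch (not part of the call; the linear autoregressive model, window size, and all names are illustrative assumptions standing in for a real self-supervised sequence encoder). It "pre-trains" a next-amount predictor on normal transaction sequences and flags transactions with large prediction error as anomalous:

```python
# Toy self-supervised anomaly scoring on transaction amount sequences.
# Assumption: a linear autoregressive predictor stands in for a learned
# sequence model; large prediction error marks a transaction as anomalous.
import numpy as np

rng = np.random.default_rng(0)

def fit_ar_model(amounts: np.ndarray, window: int = 4) -> np.ndarray:
    """Least-squares fit predicting the next amount from the previous `window`."""
    X = np.stack([amounts[i:i + window] for i in range(len(amounts) - window)])
    y = amounts[window:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def anomaly_scores(amounts: np.ndarray, coef: np.ndarray, window: int = 4):
    X = np.stack([amounts[i:i + window] for i in range(len(amounts) - window)])
    return np.abs(X @ coef - amounts[window:])  # per-transaction prediction error

coef = fit_ar_model(rng.normal(100.0, 5.0, size=500))  # "pre-train" on normal data

stream = rng.normal(100.0, 5.0, size=50)
stream[25] = 10_000.0                                  # injected anomalous transfer
scores = anomaly_scores(stream, coef)
print("most anomalous transaction index:", int(np.argmax(scores)) + 4)
```

A production system would replace the linear predictor with a trained sequence model and calibrate the error threshold on held-out data, but the detection logic (pre-train on normal behavior, score by prediction error) is the same.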
The potential technologies to realize these advancements include but are not limited to:
- AI-Empowered Consensus Mechanisms: AI is being used to improve consensus mechanisms by enhancing decision-making processes, optimizing node selection, and increasing the efficiency of reaching consensus, thereby reducing latency and resource consumption.
- AI-Empowered Blockchain Scalability Solutions: AI-driven optimization techniques, such as machine learning algorithms for predictive transaction processing, can enhance blockchain scalability by reducing transaction validation times and predicting network congestion. Additionally, AI can enable adaptive shard partitioning and node allocation, ensuring efficient distribution of resources across the network (a toy sketch follows below). This allows blockchains to handle higher transaction volumes with lower latency, improved throughput, and optimal resource usage, sustaining performance even during peak usage.
- AI-Enhanced Privacy: AI contributes to privacy preservation by generating synthetic transaction data to protect user identities, enhancing methods such as zero-knowledge proofs to keep confidential information private, and improving the privacy of covert communications within blockchain networks. AI-driven techniques can optimize the masking of transaction metadata and improve participant anonymity, ensuring that sensitive data remains protected even in complex, decentralized systems.
- AI-Empowered Network Security: AI techniques enable blockchain networks to identify anomalous traffic patterns and suspicious activities, such as unusual transaction behaviors, enhancing the network's robustness. AI-driven anomaly detection systems can also provide early warning of potential security threats, enabling proactive risk mitigation and strengthening overall network security.
- AI-Empowered Smart Contract Security: AI is leveraged to detect vulnerabilities in smart contracts, predict potential security threats, and automate the auditing and verification of code to prevent exploits and enhance overall contract safety.
- AI-Empowered Cross-chain Interoperability: AI facilitates seamless communication and transactions between different blockchain networks by enhancing protocol compatibility, automating data translation, and optimizing cross-chain interactions.
While next-generation blockchain systems offer tremendous potential, their successful adoption depends on overcoming critical challenges through innovative research and technological advancements. This Special Issue seeks to explore the latest breakthroughs in AI-empowered blockchain technologies that can propel the evolution of next-generation systems. It focuses on key issues such as scalability, security, privacy, and interoperability, aiming to enable the development of AI-driven, enterprise-ready blockchain solutions for the future.
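As a toy stand-in for the adaptive sharding and DRL-based node allocation mentioned above (this sketch is an assumption, not the call's method), an epsilon-greedy bandit can route transactions toward the shard with the lowest observed latency:

```python
# Hypothetical epsilon-greedy shard selection: route each transaction to the
# shard with the lowest running latency estimate, exploring occasionally.
# The simulated latencies and all parameters are illustrative assumptions.
import random

N_SHARDS, EPSILON = 4, 0.1
est_latency = [0.0] * N_SHARDS   # running mean latency per shard
counts = [0] * N_SHARDS

def true_latency(shard: int) -> float:
    """Simulated environment: shard 2 is currently congested."""
    return random.gauss([10.0, 12.0, 40.0, 11.0][shard], 2.0)

def pick_shard() -> int:
    if random.random() < EPSILON:          # explore a random shard
        return random.randrange(N_SHARDS)
    # exploit: untried shards (estimate 0.0) are tried first, then the fastest
    return min(range(N_SHARDS), key=lambda s: est_latency[s])

for _ in range(2000):
    s = pick_shard()
    obs = true_latency(s)
    counts[s] += 1
    est_latency[s] += (obs - est_latency[s]) / counts[s]  # incremental mean

print("learned latency estimates:", [round(x, 1) for x in est_latency])
```

A full DRL formulation would add state (queue lengths, cross-shard traffic) and long-horizon rewards, but the feedback loop (observe latency, update estimates, re-balance load) is the core idea.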
Potential topics of interest include but are not limited to the following:
- Reinforcement Learning (RL) for dynamic node selection and consensus optimization
- Generative Adversarial Networks (GANs) for simulating consensus scenarios
- AI-driven technologies for predicting consensus decisions and reducing latency
- Machine Learning (ML) algorithms for predictive transaction validation and congestion management
- Adaptive sharding using AI to optimize blockchain resource distribution
- Deep Reinforcement Learning (DRL) for throughput optimization and network load balancing
- Generative AI (GAI) to generate synthetic transaction data for privacy protection
- Anomaly detection using machine learning for identifying suspicious activities in blockchain networks
- Predictive modeling with AI to forecast and mitigate network attacks (e.g., DDoS, Sybil attacks)
- AI methods for adaptive security measures in response to evolving threats on blockchain networks
- AI-driven approaches (e.g., Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), Graph Neural Networks (GNNs)) for discovering smart contract vulnerabilities and improving security
- Large Language Models (LLMs) for automatically auditing and verifying smart contract code
- Natural Language Processing (NLP) for automatic translation of cross-chain protocols
- Graph Neural Networks (GNNs) for optimizing cross-chain transaction routing
- AI-enhanced regulatory compliance for blockchain networks
- Applications integrating AI and blockchain
- Advanced technologies for security and privacy in blockchain
- AI-empowered blockchain for B5G/6G networks
- AI-empowered blockchain for Non-Terrestrial Networks (NTNs)
- AI-empowered blockchain for Intelligent Transportation Systems (ITS)
- AI-empowered consensus and governance mechanisms for blockchain networks
Submission Guidelines
Prospective authors are invited to submit their manuscripts electronically, adhering to the IEEE Transactions on Network Science and Engineering guidelines. Note that the page limit is the same as that of regular papers. Please submit your papers through the online system and be sure to select the special issue of "Advance Technologies for Next Generation Blockchain". Manuscripts should not be published or currently submitted for publication elsewhere. Please submit only full papers intended for review, not abstracts, to the ScholarOne portal. If requested, abstracts should be sent by e-mail directly to the Guest Editors.
Important Dates
Manuscript Submission Deadline: 1 November 2025
Initial Decision Date: 1 February 2026
Revised Manuscript Due: 1 March 2026
Final Decision Date: 1 April 2026
Final Manuscript Due: 15 April 2026
Publication Date: 2026
Last updated by Dou Sun on 2025-09-26
Special Issue on Recent Advances in Extremely-High-Frequency Wireless Networks
Submission Date: 2025-11-15
The rapid advancement of next-generation applications has driven an unprecedented demand for ultra-high data rates, ultra-low latency, and high-precision sensing. To meet these requirements, the wireless research community is actively exploring extremely-high-frequency (EHF) communication and sensing technologies, leveraging the ultra-wide bandwidth of millimeter-wave (mmWave) and terahertz (THz) signals. Despite their potential, EHF signals suffer from severe path loss and molecular absorption, which restrict their communication range. They are also highly susceptible to blockages, necessitating advanced beam management techniques to ensure reliable connectivity. Hardware limitations, including the efficiency constraints of power amplifiers (PAs) and the non-ideal behaviors of analog-to-digital/digital-to-analog converters (ADCs/DACs), pose further challenges. Last but not least, dense EHF small-cell networks can overwhelm centralized controllers, requiring new distributed and hierarchical network architectures to handle the complexity. The tight coupling between physical-layer conditions (e.g., link blockage) and network-layer decisions also demands new cross-layer designs.
Fortunately, recent advancements in EHF wireless technologies have addressed many of these issues. Researchers have developed low-complexity precoding methods and new beam alignment techniques to enhance beam management performance. EHF wireless networks have also spurred innovations in high-precision imaging, localization, and environmental mapping, aligning with the vision of ubiquitous sensing-communication integration in 6G networks. New reconfigurable antenna systems have emerged, offering more cost-effective and flexible beamforming solutions than conventional large-scale antenna arrays. On the hardware front, progress in PA, ADC/DAC, and frequency multiplier designs has facilitated long-range, high-speed data transmission and high-resolution real-time video streaming. Standardization efforts have also gained momentum, with IEEE 802.15.3d supporting 100 Gbps links in the 252–325 GHz range and ITU-R Working Party 5D exploring IMT above 100 GHz. From the network-layer perspective, advanced routing protocols that account for directional links and link intermittency have been proposed to avoid deafness and reduce alignment delay in EHF networks. Machine learning models (e.g., reinforcement learning, GNNs, LLMs) are used to predict link quality, node mobility, and blockage to optimize routing in EHF wireless networks. Given these recent developments, this special issue focuses on algorithm design, hardware innovations, protocol development, and standardization efforts for EHF wireless networks.
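The severity of EHF path loss can be quantified with the standard free-space model (an assumption: this deliberately ignores the molecular absorption the call also highlights, which adds further loss at THz frequencies). A short Python sketch:

```python
# Free-space path loss: FSPL(dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c),
# for distance d in meters and carrier frequency f in Hz. Molecular absorption,
# blockage, and antenna gains are omitted in this sketch.
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m) + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

for f_ghz in (2.4, 28.0, 300.0):   # sub-6 GHz vs mmWave vs near-THz
    print(f"{f_ghz:5.1f} GHz at 100 m: {fspl_db(100.0, f_ghz * 1e9):6.1f} dB")
```

Each tenfold increase in carrier frequency costs 20 dB at a fixed distance (roughly 80 dB at 2.4 GHz versus about 122 dB at 300 GHz over 100 m), which is why EHF systems lean on high-gain directional beams, and hence on the beam management techniques discussed above.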
Prospective authors are invited to submit original manuscripts on topics including but not limited to:
- Energy-efficient design and implementation of EHF wireless networks
- Channel measurement and modeling for EHF wireless networks
- ML/AI for EHF wireless networks
- MAC layer protocols for EHF wireless networks
- Hierarchical computing and caching techniques in EHF wireless networks
- Security and privacy issues in EHF wireless networks
- Performance analysis for EHF wireless networks
- Channel estimation for EHF wireless networks
- New frameworks and architectures for EHF wireless networks
- Advanced optimization theories and algorithms for EHF wireless networks
- Next-generation multiple access for EHF wireless networks
- Secure and covert communication in EHF wireless networks
- Near-field communications and sensing in EHF wireless networks
- Cell-free EHF wireless networks
- Non-terrestrial EHF wireless networks
- Artificial intelligence-enabled EHF wireless networks
- Over-the-air computing in EHF wireless networks
- Advanced localization in EHF wireless networks
- Sensing and imaging in EHF wireless networks
- Integrated sensing and communications in EHF wireless networks
- Experimental demonstrations, prototyping, hardware designs, and field tests for EHF wireless networks
- Standardization activities for EHF wireless networks
- Other advanced technologies for EHF wireless networks, e.g., IRS/RIS, UAV, OTFS, movable/fluid/pinching antennas, etc.
Submission Guidelines
Prospective authors are invited to submit their manuscripts electronically, adhering to the IEEE Transactions on Network Science and Engineering guidelines. Note that the page limit is the same as that of regular papers. Please submit your papers through the online system and be sure to select the special issue or special section name. Manuscripts should not be published or currently submitted for publication elsewhere. Please submit only full papers intended for review, not abstracts, to the ScholarOne portal. If requested, abstracts should be sent by e-mail directly to the Guest Editors.
Important Dates
Paper Submission: 15 November 2025
Initial Decision Date: 15 February 2026
Revised Manuscript Due: 15 March 2026
Final Decision Date: 15 April 2026
Final Manuscript Due: 1 May 2026
Publication Date: Second Quarter 2026
Last updated by Dou Sun on 2025-09-26
Special Issue on Semantic Communications with Intellicise (Intelligent and Concise) Networking
Submission Date: 2025-12-01
In recent years, intellicise (intelligent and concise) networks have played a crucial role in transforming conventional networking paradigms: they emphasize native intelligence and an endogenous, simplified architecture to achieve self-evolution, self-optimization, and self-balance. In this context, unlike traditional data-centric networking that focuses on raw bit-level transmission, semantic communications with intellicise networking extract, transmit, and process high-level contextual representations, enhancing contextual adaptability and network scalability in dynamic, large-scale environments.
Despite this potential, semantic communications with intellicise networking face several critical challenges. To meet the fluctuating demands of modern networks, network management must be not only adaptive but also context-aware, dynamically adjusting resource distribution and traffic flows based on semantic importance to minimize latency and enhance efficiency. Furthermore, ensuring privacy-preserving communication, robust security mechanisms, and effective trust management is crucial in heterogeneous, large-scale, and dynamic network environments, where only contextually relevant information should be transmitted. In addition, the interplay between semantic communication and intellicise network structures remains an open research challenge: the evolution of semantic-aware traffic can affect network topology, routing protocols, and congestion control, necessitating new methodologies for network optimization. Latency-aware compression techniques are essential for the practical, large-scale deployment of semantic communication. Moreover, defining new performance indicators, such as semantic fidelity, effective bandwidth, and information transfer efficiency, will allow better quantification of communication efficiency in networked environments. This special issue seeks to advance these areas by exploring novel architectures, methodologies, and applications.
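Since the call leaves indicators such as semantic fidelity open, one plausible (hypothetical) instantiation is the embedding similarity between a transmitted message and its reconstruction; the toy hash-based encoder below is a stand-in for a learned semantic encoder, and all names are illustrative:

```python
# Hypothetical "semantic fidelity" indicator: cosine similarity between
# embeddings of the source message and its reconstruction. The encoder here
# is a toy hash-based bag of words, standing in for a learned semantic model.
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    v = np.zeros(dim)
    for word in text.lower().split():
        v[hash(word) % dim] += 1.0     # bucket each word into the vector
    return v

def semantic_fidelity(sent: str, received: str) -> float:
    a, b = embed(sent), embed(received)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

src = "reroute traffic around the congested core link"
close = "reroute flows around the congested core link"    # meaning preserved
far = "increase transmit power on all access antennas"    # meaning lost
print(semantic_fidelity(src, close), semantic_fidelity(src, far))
```

The point is that a semantic metric scores the paraphrase far higher than the unrelated reconstruction, whereas bit-level metrics would penalize both equally; any deployed indicator would use a proper learned encoder and task-level grounding.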
We welcome original high-quality submissions on topics including, but not limited to:
- Semantic communications with next-generation intellicise networking
- AI-driven wireless communications with intellicise networking
- Lightweight and efficient semantic communications with intellicise networking
- Semantic knowledge base construction and deployment with intellicise networking
- Adaptive and context-aware network management for semantic communications
- MoE (Mixture of Experts)-based adaptive encoder and decoder selection for semantic communications with intellicise networking
- Agentic AI-driven resource allocation optimization for semantic communications with intellicise networking
- Token communications and context-based generative semantic communications
- Generative AI (GAI)-enabled semantic communications with intellicise networking
- Latency-aware semantic compression and inference for intellicise networking
- Security and trust in large-scale semantic networks, including DDoS resilience
- Semantic-aware congestion control, node discovery, and routing protocols
- New performance indicators for semantic communication with intellicise networking
- Experimental validation and testbeds for semantic communications with intellicise networking
- Real-world case studies of semantic communications with intellicise networking
Submission Guidelines
Prospective authors are invited to submit their manuscripts electronically, adhering to the IEEE Transactions on Network Science and Engineering guidelines. Note that the page limit is the same as that of regular papers. Please submit your papers through the online system and be sure to select the special issue or special section name. Manuscripts should not be published or currently submitted for publication elsewhere. Please submit only full papers intended for review, not abstracts, to the ScholarOne portal. If requested, abstracts should be sent by e-mail directly to the Guest Editors.
Important Dates
Manuscript Submission Deadline: 1 December 2025
Initial Decision Date: 1 March 2026
Revised Manuscript Due: 1 April 2026
Final Decision Date: 1 May 2026
Final Manuscript Due: 15 May 2026
Publication Date: Second Quarter 2026
Last updated by Dou Sun on 2025-09-26
Special Issue on Advanced Application of Graph Representation Learning in Communication Networks
Submission Date: 2026-01-01
The rapid advancement of emerging communication technologies, such as flexible antennas and integrated sensing and communication, has driven the evolution of communication networks toward both high-dimensional resource utilization and multifunctional integration. This evolution makes it challenging to design communication networks that efficiently satisfy the quality-of-service and time-sensitive requirements of mobile applications in dynamic environments. In view of the notable successes of deep learning (DL) in numerous application fields, applying DL to enhance network intelligence has become an inevitable trend. DL can be integrated into communication networks across almost all layers to enhance resource utilization efficiency and network security. Furthermore, DL serves as a key enabler of innovative wireless applications such as semantic communications. The task-specific character of DL necessitates customizing learning strategies and neural networks for communication networks.
Communication networks comprise core components that naturally form a graph structure, such as wireless topologies and routing patterns. Graph representation learning (GRL) has therefore emerged as a powerful DL tool for re-designing communication networks. In particular, graph neural networks (GNNs) are increasingly recognized as a fundamental model for graph-structured communication networks. GNNs not only augment feature extraction over the network topology but also facilitate scalable and distributed computation. GRL/GNNs can be integrated with existing wireless DL approaches as a network feature extractor. For instance, the superiority of GNNs over other neural networks has been widely demonstrated in addressing wireless optimization problems following the "learning-to-optimize" paradigm, enabling real-time and scalable inference of (near-)optimal resource allocation. GRL/GNNs shift traditional iterative optimization methods toward data-driven and model-based strategies, introducing unprecedented flexibility in addressing complex problems. Exploring the advanced application of GRL/GNNs in upcoming 6G can support a variety of fundamental and practical communication designs and optimization schemes, including but not limited to:
- Real-time Optimization: GRL/GNNs can automatically learn to approximate the optimal mapping from network topologies to targeted resource allocation strategies, routing results, etc., thereby enabling real-time computation.
- Scalable and Distributed Implementation: GRL/GNNs are scalable to the number of elements in communication networks, enabling signal processing in ultra-dense and dynamic wireless environments. The scalability also facilitates distributed implementation by integrating the signaling exchange of wireless networks with the message-passing mechanism of GNNs.
- Unified Solution Framework: Learning on a graph that models a specific network topology can extract critical topological features, which can then be harnessed to tackle diverse problems via a unified framework. Fine-tuning pre-trained GNNs via transfer learning can further enhance solving performance.
- Complex Problem Solver: Data-driven DL offers a transformative paradigm for tackling complex communication problems through GRL/GNNs. This novel paradigm for constructing problem solvers balances the performance-efficiency trade-off of heuristic algorithms.
- Wireless Intelligence Module: By casting GNNs as a network topology-aware feature extraction module, they can be seamlessly integrated into other wireless DL architectures (such as wireless generative artificial intelligence (GAI) and wireless large language models (LLMs)) to jointly support wireless intelligence.
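To illustrate the message-passing idea behind these schemes, the sketch below runs one untrained GNN layer over a random interference graph and reads out per-link transmit powers (the graph model, shapes, and random weights are illustrative assumptions; a real learning-to-optimize system would train the weights, e.g., to maximize sum rate):

```python
# One GNN message-passing layer over a K-pair interference graph, mapping
# channel gains to transmit powers. Weights are random and untrained; the
# point is the topology-aware, permutation-equivariant computation pattern.
import numpy as np

rng = np.random.default_rng(1)
K, D = 5, 8                        # K transmitter-receiver pairs, embedding dim D
G = rng.rayleigh(1.0, (K, K))      # G[i, j]: gain from transmitter j to receiver i

W_self = rng.normal(0, 0.3, (1, D))   # lifts a link's direct gain to an embedding
W_msg = rng.normal(0, 0.3, (1, D))    # lifts aggregated interference gains
W_out = rng.normal(0, 0.3, (D, 1))    # readout: embedding -> power level

def gnn_power_allocation(G: np.ndarray) -> np.ndarray:
    direct = np.diag(G)[:, None]                          # (K, 1) desired-link gains
    h = np.maximum(direct @ W_self, 0.0)                  # self features, ReLU
    interference = G - np.diag(np.diag(G))                # off-diagonal gains only
    msgs = np.maximum(interference.sum(1, keepdims=True) @ W_msg, 0.0)  # sum-aggregate
    h = h + msgs                                          # combine self and messages
    return 1.0 / (1.0 + np.exp(-(h @ W_out)))             # powers squashed into (0, 1)

print("per-link powers:", np.round(gnn_power_allocation(G)[:, 0], 3))
```

Because the layer aggregates neighbor messages with a sum, relabeling the K pairs permutes the output identically (permutation equivariance), and the same trained weights apply to networks of any size, which is exactly the scalability property highlighted in the list above.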
While GRL/GNNs can realize intelligent signal processing, there remains a pressing need for advanced model architectures and effective mechanisms that enhance learning performance and tackle challenges such as guaranteeing feasible solutions, capturing heterogeneous relationships for diverse wireless services, achieving robust information transmission, and enabling efficient deployment strategies. Moreover, the lack of standardized datasets, task definitions, and unified evaluation metrics hampers the reproducibility of experimental procedures in DL-enabled signal processing. An in-depth exploration of the connection between DL approaches and communication tasks is therefore urgently required for the application of GRL/GNNs in communication networks. The integration of DL and communication networks is a cutting-edge field with revolutionary potential in future mobile communication systems. This special issue is dedicated to exploring breakthroughs in the advanced applications of GRL in communication networks from both theoretical and practical perspectives, in order to provide a more comprehensive understanding of the philosophy of DL-enabled communication networks.
Topics of interest include, but are not limited to:
- Theoretical fundamentals of the application of GRL/GNNs in communication networks
- Performance analysis of GRL/GNN-enabled communication networks
- Distributed implementation strategies for GRL/GNN-based communication schemes
- Datasets and evaluation metrics for GRL/GNN-enabled communication networks
- Integration of over-the-air computation and GNNs
- Training strategies, activation functions, and loss functions for GNN-based signal processing
- Heterogeneous GNNs for communication networks
- LLMs and GAI with GRL in communication networks
- GRL/GNNs for multifunctional wireless services, e.g., integrated sensing and communication, physical-layer security, and simultaneous wireless information and power transfer
- Novel neural architectures of GNNs for communication networks
- GRL/GNN-enabled semantic communications
- Deep reinforcement learning-based GNNs in dynamic communication environments, e.g., unmanned aerial vehicle-assisted communication systems and space-air-ground integrated networks
- Application of GRL/GNNs in channel estimation and signal detection
- GRL/GNN-enabled channel coding
- GRL/GNN-enabled multiple access schemes
- GRL/GNN-enabled routing algorithms and protocols
- GRL/GNN-based designs for flexible antennas, e.g., movable antennas, fluid antenna systems, pinching antennas, and reconfigurable intelligent surfaces
- GRL/GNN-enabled mobile edge computing and federated learning strategies
- Integration of GRL with quantum computing and quantum machine learning in communication networks
- Prototypes and experimental studies of GRL/GNN-based communication networks
Submission Guidelines
Prospective authors are invited to submit their manuscripts electronically, adhering to the IEEE Transactions on Network Science and Engineering guidelines. Note that the page limit is the same as that of regular papers. Please submit your papers through the online system and be sure to select the special issue or special section name.
Manuscripts should not be published or currently submitted for publication elsewhere. Please submit only full papers intended for review, not abstracts, to the ScholarOne portal. If requested, abstracts should be sent by e-mail directly to the Guest Editors.
Important Dates
Manuscript Submission Deadline: 1 January 2026
Initial Decision Date: 1 April 2026
Revised Manuscript Due: 1 May 2026
Final Decision Date: 1 June 2026
Final Manuscript Due: 15 June 2026
Publication Date: Second Quarter 2026
Last updated by Dou Sun on 2025-09-26
Best Papers
Year | Best Papers |
---|---|
2019 | Network Maximal Correlation |
Related Journals
CCF | Full Name | Impact Factor | Publisher | ISSN |
---|---|---|---|---|
| International Journal of UbiComp | | AIRCC | 0976-2213 |
| Journal of Information Science | 1.800 | SAGE | 0165-5515 |
| Language Learning & Technology | 3.800 | University of Hawaii Press | 1094-3501 |
| Journal of Cloud Computing | | Springer | 2192-113X |
| Fuzzy Information and Engineering | | Taylor & Francis | 1616-8658 |
| Numerical Heat Transfer, Part B: Fundamentals | 1.700 | Taylor & Francis | 1040-7790 |
| Journal of Cultural Heritage | 3.500 | Elsevier | 1296-2074 |
b | Journal of Computer Science and Technology | 1.200 | Springer | 1000-9000 |
c | Expert Systems | 3.000 | John Wiley & Sons | 1468-0394 |
a | ACM Transactions on Software Engineering and Methodology | 6.2 | ACM | 1049-331X |
Related Conferences
Short | Full Name | Conference Date |
---|---|---|
ICMTMA | International Conference on Measuring Technology and Mechatronics Automation | 2020-01-04 |
ICMSNT | International Conference on Materials Science and Nanotechnology | 2018-03-29 |
CoMSE | International Conference on Materials Science and Engineering | 2024-09-13 |
DMA | International Conference on Data Mining and Applications | 2022-05-28 |
SIUSAI | International Symposium on Intelligent Unmanned Systems and Artificial Intelligence | 2025-08-15 |
MDEE | International Conference on Mechanical Design and Electronic Engineering | 2017-08-18 |
IWFS | International Workshop on Fuzzy Systems | 2020-12-16 |
IOLTS | IEEE International Symposium on On-Line Testing and Robust System Design | 2025-07-07 |
ICHI | IEEE International Conference on Healthcare Informatics | 2024-06-03 |
ICRAIE | International Conference and Workshops on Recent Advances and Innovations in Engineering | 2022-12-01 |