Dynamic Network Representation Based on Latent Factorization of Tensors - Hao Wu, Xuke Wu, Xin Luo

Dynamic Network Representation Based on Latent Factorization of Tensors (eBook)

Hao Wu, Xuke Wu, Xin Luo (authors)

eBook Download: PDF
2023 | 1st ed. 2023
VIII, 80 pages
Springer Nature Singapore (publisher)
978-981-19-8934-6 (ISBN)
€53.49 incl. VAT
(CHF 52.25)
eBook sales are handled by Lehmanns Media GmbH (Berlin) at the price in euros incl. VAT.
  • Download available immediately

A dynamic network is frequently encountered in various real industrial applications, such as the Internet of Things. It is composed of numerous nodes and large-scale dynamic real-time interactions among them, where each node denotes a specific entity, each directed link denotes a real-time interaction, and the strength of an interaction can be quantified as the weight of a link. As the number of involved nodes grows drastically, it becomes impossible to observe their full interactions at each time slot, making the resultant dynamic network High-Dimensional and Incomplete (HDI). Despite its HDI nature, an HDI dynamic network with directed and weighted links contains rich knowledge regarding the involved nodes' various behavior patterns. Therefore, it is essential to study how to build efficient and effective representation learning models for acquiring such useful knowledge.

In this book, we first model a dynamic network as an HDI tensor and present the basic latent factorization of tensors (LFT) model. Then, we propose four representative LFT-based network representation methods. The first method integrates the short-time bias, long-time bias, and preprocessing bias to precisely represent the volatility of network data. The second method utilizes a proportional-integral-derivative controller to construct an adjusted instance error and thereby achieve a higher convergence rate. The third method considers the non-negativity of fluctuating network data by constraining the latent features to be non-negative and incorporating an extended linear bias. The fourth method adopts an alternating direction method of multipliers framework to build a learning model that represents dynamic networks with high precision and efficiency.
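To make the basic LFT idea concrete, the following is a minimal, illustrative Python sketch, not the authors' implementation; the function name lft_sgd and all parameter choices are assumptions made here. It factorizes a sparse three-way tensor, given only as observed (source node, target node, time slot, weight) entries, using one latent-feature matrix per mode and regularized stochastic gradient descent over the observed entries only, i.e. the CP-style scheme that the biased, PID-adjusted, non-negative, and ADMM-based variants described above refine.

```python
import numpy as np

def lft_sgd(observed, shape, rank=8, lr=0.05, reg=0.05, epochs=500, seed=0):
    """observed: iterable of (i, j, k, weight) entries of the HDI tensor;
    shape: (num_source_nodes, num_target_nodes, num_time_slots)."""
    rng = np.random.default_rng(seed)
    # One latent-feature matrix per tensor mode.
    S = rng.uniform(0.0, 0.1, (shape[0], rank))   # source-node features
    T = rng.uniform(0.0, 0.1, (shape[1], rank))   # target-node features
    D = rng.uniform(0.0, 0.1, (shape[2], rank))   # time-slot features
    for _ in range(epochs):
        for i, j, k, y in observed:
            pred = np.dot(S[i] * T[j], D[k])      # CP-style reconstruction of one entry
            err = y - pred                        # instance error on an observed entry
            s, t, d = S[i].copy(), T[j].copy(), D[k].copy()
            # Regularized SGD updates driven only by observed entries.
            S[i] += lr * (err * t * d - reg * s)
            T[j] += lr * (err * s * d - reg * t)
            D[k] += lr * (err * s * t - reg * d)
    return S, T, D

# Toy usage: a few observed weighted links among 4 nodes over 3 time slots.
observed = [(0, 1, 0, 2.0), (1, 2, 0, 1.5), (0, 1, 1, 2.2), (2, 3, 2, 0.7)]
S, T, D = lft_sgd(observed, shape=(4, 4, 3))
print(np.dot(S[0] * T[2], D[1]))  # estimated weight of the unobserved link 0 -> 2 at slot 1
```

Restricting training to observed entries is what lets such a model cope with the HDI nature of the data: missing links never contribute to the loss, and any unobserved weight can afterwards be estimated from the learned latent features.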



Hao Wu received a Ph.D. degree in Computer Science from the University of Chinese Academy of Sciences, Beijing, China, in 2022. He is currently an Associate Professor of Data Science with the College of Computer and Information Science, Southwest University, Chongqing, China. His research interests include big data analytics and tensor methods.

Xuke Wu is currently pursuing a Ph.D. degree at the College of Computer Science and Technology, Chongqing University of Posts and Telecommunications, Chongqing, China. His current research interests include data mining and intelligent transportation systems.

Xin Luo received a Ph.D. degree in Computer Science from Beihang University, Beijing, China, in 2011. He is currently a Professor of Data Science and Computational Intelligence with the College of Computer and Information Science, Southwest University, Chongqing, China. He has authored or coauthored over 200 papers (including over 90 IEEE Transactions papers) in his areas of interest. His research interests include big data analysis and intelligent control.


Publication date (per publisher) 7.3.2023
Series SpringerBriefs in Computer Science
Additional info VIII, 80 p., 20 illus., 16 illus. in color.
Language English
Subject areas Mathematics / Computer Science > Computer Science > Databases
Computer Science > Theory / Study > Algorithms
Computer Science > Theory / Study > Artificial Intelligence / Robotics
Mathematics / Computer Science > Mathematics > Statistics
Keywords Big Data • Dynamic network representation • High-dimensional and incomplete tensor • Latent factorization of tensors • representation learning
ISBN-10 981-19-8934-6 / 9811989346
ISBN-13 978-981-19-8934-6 / 9789811989346
PDF (watermark)
Size: 3.8 MB

DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized for you. If the eBook is improperly passed on to third parties, it can be traced back to its source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly suitable for technical books with columns, tables, and figures. A PDF can be displayed on almost all devices but is only suitable to a limited extent for small displays (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read with (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You need a PDF viewer, e.g. the free Adobe Digital Editions app.

