Springer Handbook of Science and Technology Indicators (eBook)
XXXVIII, 1103 pages
Springer International Publishing (publisher)
978-3-030-02511-3 (ISBN)
This handbook presents the state of the art of quantitative methods and models to understand and assess the science and technology system. Focusing on various aspects of the development and application of indicators derived from data on scholarly publications, patents and electronic communications, the individual chapters, written by leading experts, discuss theoretical and methodological issues, illustrate applications, highlight their policy context and relevance, and point to future research directions.
A substantial portion of the book is dedicated to detailed descriptions and analyses of data sources, presenting both traditional and advanced approaches. It addresses the main bibliometric measures and indexes, such as the journal impact factor and the h-index, as well as altmetric and webometric indicators and science mapping techniques at different levels of aggregation, examining their value for the assessment of research performance and their impact on research policy and society. It also presents and critically discusses various national research evaluation systems.
Complementing the sections on the science system, the technology section includes multiple chapters that explain different aspects of patent statistics, patent classification and database search methods for retrieving patent-related information. It also examines the relevance of trademarks and standards as further technology indicators.
The Springer Handbook of Science and Technology Indicators is an invaluable resource for practitioners, scientists and policy makers wanting a systematic and thorough analysis of the potential and limitations of the various approaches to assess research and research performance.
Wolfgang Glänzel is Director of the Centre for R&D Monitoring (ECOOM) and full Professor at KU Leuven, Belgium. He has worked on numerous bibliometric projects, above all for the European Commission and several national governments. He is Editor-in-Chief of the international journal Scientometrics, Academic Editor of the journal PLoS One and Secretary-Treasurer of the International Society for Scientometrics and Informetrics (ISSI). He has published in probability theory and mathematical statistics, computer science and various topics in quantitative science studies, scientometrics and informetrics. His main research topics are the development and application of bibliometric indicators at various levels of aggregation, the detection and analysis of cognitive structures of scientific research, subject delineation and classification, and mathematical models of the processes of scholarly communication. In 1999 he received the international Derek de Solla Price Award for outstanding contributions to the quantitative study of science.
Henk F. Moed was a senior staff member and full professor of research assessment methodologies at the Centre for Science and Technology Studies (CWTS) at Leiden University, The Netherlands, between 1981 and 2010. He obtained a Ph.D. degree in Science Studies at Leiden University in 1989. He has been active in numerous research areas, including the creation of bibliometric databases from raw data from Thomson Scientific's Web of Science and Elsevier's Scopus, the analysis of inaccuracies in citation matching, the assessment of the potential and pitfalls of journal impact factors, the development and application of science indicators for measuring research performance, the multi-dimensional assessment of research impact, and the potential of altmetrics. He is currently an independent scientific advisor and visiting professor at the Sapienza University of Rome.
Ulrich Schmoch obtained a diploma in mechanical engineering in 1977 and a Ph.D. degree in social sciences in 1983. Between 1983 and 1985 he worked at the office of a patent attorney. Since 1986 he has been at the Fraunhofer Institute for Systems and Innovation Research in Karlsruhe, Germany, where he held the positions of deputy head of department and head of department for many years. He was a visiting fellow at the University of Stellenbosch, South Africa, in 2011, and was seconded to the University of Speyer, Germany, as head of a study course on science management between 2012 and 2014. At Fraunhofer, he has conducted several dozen research projects on behalf of various national governments, the European Commission, the OECD, WIPO, and various foundations and enterprises. His special research areas are innovation indicators of all kinds and their combination, in particular patent, publication, trademark, foreign trade and production indicators.
Mike Thelwall is Professor of Information Science and leader of the Statistical Cybermetrics Research Group at the University of Wolverhampton, UK, which he joined in 1989. He holds a PhD in Pure Mathematics from the University of Lancaster. His current research involves identifying and analysing web phenomena using quantitative methods, primarily link analysis and sentiment analysis, and he has pioneered an information science approach to link analysis. He has developed a wide range of tools for gathering and analysing web data, including hyperlink analysis, sentiment analysis and content analysis for Twitter, YouTube, MySpace, blogs and the web in general. He was a member of the UK's independent review of the role of metrics in research assessment (2014-15) and is a member of the UK Forum for Responsible Research Metrics.
Preface 6
Editors’ Introduction 7
About the Editors 15
List of Authors 17
Contents 21
List of Abbreviations 32
Part A Analysis of Data Sources and Network Analysis 37
1 The Journal Impact Factor: A Brief History, Critique, and Discussion of Adverse Effects 39
1.1 Origins of the Journal Impact Factor 39
1.2 Calculation and Reproduction 41
1.3 Critiques 42
1.4 Systemic Effects 50
1.5 What Are the Alternatives? 54
1.6 The Future of Journal Impact Indicators 55
References 56
2 Bibliometric Delineation of Scientific Fields 61
2.1 Shaping the Landscape of Scientific Fields 61
2.2 Context 62
2.3 Tools: Information Retrieval (IR) and Bibliometrics 71
2.4 Multiple Networks and Hybridization 84
2.5 Delineation Schemes and Conclusion 91
References 95
3 Knowledge Integration: Its Meaning and Measurement 105
3.1 Interdisciplinarity 106
3.2 Definitions 106
3.3 Drivers and Arguments in Favor of Interdisciplinary Research 108
3.4 Different Aspects of Interdisciplinary Work 109
3.5 Quantitative Measures: Introduction 110
3.6 Structural Approach 111
3.7 IDR in the Research Landscape 112
3.8 Concrete Measurements 112
3.9 Entropy is not the Same as Diversity or Interdisciplinarity 114
3.10 The Rafols–Meyer Framework 115
3.11 Knowledge Diffusion as the Mirror Image of Knowledge Integration 116
3.12 Other Network Measures 117
3.13 Evaluating Interdisciplinary Work 118
3.14 Does Interdisciplinary Research Have More Impact? 118
3.15 Measuring Cognitive Distance 119
3.16 Identification of Interdisciplinary Ideas 121
3.17 Time Aspects 121
3.18 Limitations of Existing Approaches 122
3.19 An Example Within the Rafols–Meyer Framework 122
3.20 Conclusions and Suggestions for Further Research 125
References 126
4 Google Scholar as a Data Source for Research Assessment 131
4.1 The Origins of Google Scholar 131
4.2 Basic Functioning of Google Scholar 133
4.3 Radiographing a Big Data Bibliographic Source 138
4.4 Google Scholar's Data for Scientometric Analyses 155
4.5 The Expanded Academic World of Google Scholar 157
4.6 Final Remarks 159
References 161
5 Disentangling Gold Open Access 164
5.1 Open Access and Scholarly Communication 164
5.2 What is Open Access? 165
5.3 Disentangling Gold Open Access 167
5.4 Conclusions and Future Prospects 175
References 177
6 Science Forecasts: Modeling and Communicating Developments in Science, Technology, and Innovation 180
6.1 Models and Visualizations 180
6.2 Models and Modeling 181
6.3 Modeling Science 182
6.4 Exemplary Models of Science 184
6.5 Challenges 185
6.6 Insights and Opportunities 187
6.7 Outlook 190
References 190
7 Science Mapping Analysis Software Tools: A Review 193
7.1 Science Mapping Analysis 193
7.2 Bibliographic Networks 195
7.3 Science Mapping Software 196
7.4 Software Characteristics: Summary and Comparison 213
7.5 Conclusions 214
References 215
8 Creation and Analysis of Large-Scale Bibliometric Networks 220
8.1 Fundamentals and Scope 220
8.2 Background 221
8.3 Studies of Large-Scale Bibliometric Networks 230
8.4 The STS Global Model of Science 237
8.5 Summary and Implications 242
References 243
9 Science Mapping and the Identification of Topics: Theoretical and Methodological Considerations 246
9.1 General Drivers for Advancement of Science Mapping 246
9.2 Creation of Document Networks 248
9.3 Techniques for Community Detection 255
9.4 Methodological Constraints 258
9.5 Local Versus Global Applications 259
9.6 Conclusions 263
References 263
Part B Advancement of Methodology for Research Assessment 267
10 Measuring Science: Basic Principles and Application of Advanced Bibliometrics 269
10.1 A Short History of Scientometrics 270
10.2 Bibliometric Analysis: Rationale, Practical Needs, Basics 274
10.3 Practical Application of Research Performance Indicators 285
10.4 What Is a Bibliometric Science Map? 298
10.5 Can Science Be Measured? 303
References 304
11 Field Normalization of Scientometric Indicators 313
11.1 Background 313
11.2 What Is Field Normalization? 314
11.3 Field Classification Systems 315
11.4 Overview of Field-Normalized Indicators 317
11.5 Evaluation of Field-Normalized Indicators 321
11.6 How Much Difference Does It Make in Practice? 323
11.7 Conclusion 326
References 328
12 All Along the h-Index-Related Literature: A Guided Tour 333
12.1 h-Index Basics 334
12.2 A General Overview of the Literature on the h-Index 335
12.3 Compiling h-Index Bibliographies from Various Bibliographic Databases 337
12.4 A Bibliometric Overview of the h-Index Literature 340
12.5 Application of the h-Index Concept Within and Outside the Realm of Bibliometrics 347
12.6 Mathematical Models of the h-Index 352
12.7 Closing Remarks 357
12.A Appendix 334
12.B Appendix 335
References 361
13 Citation Classes: A Distribution-based Approach for Evaluative Purposes 367
13.1 General Introduction: The Need for Multilevel Profiling of Citation Impact 368
13.2 The Method of Characteristic Scores and Scales (CSS) 371
13.3 Characteristic Scores and Scales in Research Assessment 373
13.4 Characteristic Scores and Scales in New Environments? Some Future Perspectives 389
13.A Appendix 368
References 390
14 An Overview of Author-Level Indicators of Research Performance 393
14.1 A Brief Introduction to Author-Level Indicators 393
14.2 Brief Review: Trends in Indicator Development 395
14.3 General Characteristics of Author-Level Indicators 398
14.4 Schematizing the Indicators 406
14.5 The Appropriateness of ALIRP and the Application Context 419
14.6 Conclusions 420
14.A Appendix 393
References 422
15 Challenges, Approaches and Solutions in Data Integration for Research and Innovation 429
15.1 The Role of Data Integration for Research and Innovation 429
15.2 The Problem of Data Integration and Data Governance 432
15.3 Formal Framework for OBDI 434
15.4 Sapientia and OBDI for Multidimensional Research Assessment 438
15.5 Reasoning over Sapientia: Some Illustrative Examples 442
15.6 Conclusions 449
References 451
16 Synergy in Innovation Systems Measured as Redundancy in Triple Helix Relations 453
16.1 The Triple Helix Model of Innovations 453
16.2 Institutional and Evolutionary TH Models 454
16.3 The Operationalization of the Triple Helix 458
16.4 The Generation of Redundancy 460
16.5 The Triple Helix Indicator of Mutual Redundancy 460
16.6 The Measurement 462
16.7 Measuring the Knowledge Base of Innovation Systems 463
16.8 Institutional Retention 467
16.9 Concluding Remarks 468
16.A Appendix: Comparison Among Country Studies in Terms of the Main Results 453
16.B Appendix: Comparison Among Country Studies in Terms of the Data 454
References 470
Part C Science Systems and Research Policy 476
17 Scientometrics Shaping Science Policy and vice versa, the ECOOM Case 478
17.1 Scientometrics and Science Policy, a Symbiotic Relationship 478
17.2 ECOOM: An Instrument Linking Science Policy and Scientometrics in Flanders 480
17.3 ECOOM: Mapping and Benchmarking Science Activities in Flanders 482
17.4 ECOOM: Input for Funding Formulas of Science Activities in Flanders 485
17.5 ECOOM: No Data and No Indicators Without a Solid IT Backbone 488
17.6 Insights Obtained 492
References 494
18 Different Processes, Similar Results? A Comparison of Performance Assessment in Three Countries 496
18.1 Background 497
18.2 Research Assessment in the United Kingdom 498
18.3 Research Assessment in Australia 499
18.4 Research Assessment in Germany 500
18.5 Comparing What is Assessed in Each System 502
18.6 Comparing the Role of Metrics in Each System 503
18.7 Data and Methods 505
18.8 Analysis of Bibliometric Data 506
18.9 Discussion and Conclusions 513
References 513
19 Scientific Collaboration Among BRICS: Trends and Priority Areas 516
19.1 BRICS: From Origin to Priority Areas in ST&I
19.2 Methodology 518
19.3 Results 519
19.4 Discussion and Final Remarks 532
References 534
20 The Relevance of National Journals from a Chinese Perspective 536
20.1 Journal Evaluation 538
20.2 Development of STM Journals in China and Demand for Evaluation 541
20.3 Comparative Study of International and National Evaluation Systems of Academic Journals in China 544
20.4 Comparative Study of International and National Evaluation Indicators of Academic Journals in China 557
20.5 China's STM Journals: The Development of the Boom Index and its Monitoring Function 561
20.6 The Definition and Application of Comprehensive Performance Scores (CPS) for Chinese Scientific and Technical Journals 574
20.7 Evaluation of English-Language Science and Technology Journals in China 578
References 590
21 Bibliometric Studies on Gender Disparities in Science 594
21.1 Background 594
21.2 Gender Determination 596
21.3 Definitions 597
21.4 Research Approach 598
21.5 Data Collection and Datasets Used 599
21.6 Methodology 599
21.7 Productivity 600
21.8 Research Performance 602
21.9 Impact and Visibility 603
21.10 Careers: Recruitment and Promotions 605
21.11 Summary 606
References 607
22 How Biomedical Research Can Inform Both Clinicians and the General Public 612
22.1 Study Objectives 613
22.2 Methodology 617
22.3 Results: Clinical Practice Guidelines 621
22.4 Results: Newspaper Stories 628
22.5 Discussion 632
22.A Appendix 613
References 637
23 Societal Impact Measurement of Research Papers 639
23.1 Definition of Societal Impact as Well as Reasons for and Problems with the Measurement 641
23.2 Societal Impact Considerations in Evaluative Practice 645
23.3 Case Studies and Quantitative Indicators 648
23.4 Altmetrics 652
23.5 Discussion 656
References 658
24 Econometric Approaches to the Measurement of Research Productivity 663
24.1 Assessing the Productivity of Research 664
24.2 What Do We Measure? 665
24.3 Research Assessment in the Current Time and the Need for a Framework 671
24.4 Economics and Econometrics in the Current Time 677
24.5 What We Could Learn from Economics and Management 678
24.6 Methodological Challenges in the Assessment of Productivity/Efficiency of Research 681
24.7 Potential of Econometric Approaches and of Nonparametric Methods 686
24.8 Conclusions 690
References 690
25 Developing Current Research Information Systems (CRIS) as Data Sources for Studies of Research 697
25.1 Current Research Information Systems 697
25.2 The Need for Top-Down Coordination 699
25.3 Towards Internationally Integrated CRIS 700
25.4 Commercial Solutions to CRIS 702
25.5 Agreeing on Sharing Well-Defined Data 702
25.6 Testing Real Data Sharing in the Social Sciences and Humanities 703
25.7 Subject Classification 704
25.8 Dynamic Registers of Evaluated Scholarly Publication Channels 704
25.9 Ensuring Comprehensiveness of Data in a CRIS 705
25.10 Ensuring the Quality and Consistency of Data in CRIS 706
25.11 Examples of Studies of Research Based on CRIS Data 707
25.12 Conclusions 710
References 711
Part D New Indicators for Research Assessment 714
26 Social Media Metrics for New Research Evaluation 716
26.1 Social Media Metrics and Altmetrics 716
26.2 Research Evaluation: Principles, Frameworks, and Challenges 717
26.3 Social Media Data and Indicators 720
26.4 Conceptualizing Social Media Metrics for Research Evaluation and Management 723
26.5 Data Issues and Dependencies of Social Media Metrics 725
26.6 Conceptualizing Applications of Social Media Metrics for Research Evaluation and Management 725
26.7 Prospects for Social Media Metrics in Research Evaluation 734
26.8 Concluding Remarks 737
References 738
27 Reviewing, Indicating, and Counting Books for Modern Research Evaluation Systems 743
27.1 Evaluating Scholarly Books 743
27.2 The Monitors 744
27.3 The Subject Classifiers 746
27.4 The Indexers 747
27.5 The Indicator Constructionists 748
27.6 Integrating Book Metrics into Evaluation Practices 751
References 752
28 Scholarly Twitter Metrics 757
28.1 Tweets as Measures of Impact 757
28.2 Twitter in Scholarly Communication 758
28.3 Scholarly Output on Twitter 767
28.4 Conclusion and Outlook 781
References 782
29 Readership Data and Research Impact 789
29.1 Introduction and Overview 789
29.2 Reading Research: Background and Terminology 790
29.3 Readership Data from Libraries 791
29.4 Research Impact Assessment 791
29.5 Online Access and Download Data 793
29.6 Readership Data from Online Reference Managers 795
29.7 Usage Data from Academic Social Network Sites 802
29.8 Summary 802
References 802
30 Data Collection from the Web for Informetric Purposes 808
30.1 Background 808
30.2 Early Studies 809
30.3 Applying Bibliometric Laws to Data Retrieved from the Web 810
30.4 Longitudinal Studies 810
30.5 Search Engine Reliability and Validity 811
30.6 Data Cleansing 813
30.7 Link Analysis 813
30.8 Bibliometric Citations Versus Web References 815
30.9 Google Scholar 816
30.10 Additional Google Sources 819
30.11 Microsoft Academic 821
30.12 Subject Specific and Institutional Repositories 821
30.13 Altmetrics 822
30.14 A Wish-List for Future Data Collection from the Web 823
References 824
31 Web Citation Indicators for Wider Impact Assessment of Articles 828
31.1 Web as a Citation Source 828
31.2 Sources of Web Citations: Websites and Document Genres 829
31.3 Web Citation Indicators for Journals 836
31.4 Types of Web Citation Impacts 836
31.5 Web Citation Searching 838
31.6 Correlations Between Web Citation Indicators and Citation Counts for Academic Articles 839
31.7 Limitations of Web Citation Analysis 840
31.8 Conclusions 841
References 842
32 Usage Bibliometrics as a Tool to Measure Research Activity 846
32.1 Previous Studies and Scope 846
32.2 Definition of Terminology 847
32.3 Usage and Research Activity 850
32.4 Traditional Indicators 856
32.5 Discussion 857
32.6 Concluding Remarks 859
References 860
33 Online Indicators for Non-Standard Academic Outputs 862
33.1 Non-Standard Academic Outputs 862
33.2 Core Concepts 865
33.3 Research Outputs for Applications 867
33.4 Multimedia Outputs 869
33.5 Websites 872
33.6 Documentary Outputs 874
33.7 Reputation 876
33.8 Summary: The Importance of Context 876
References 877
Part E Advancement of Methodology for Patent Analysis 884
34 Information Technology-Based Patent Retrieval Models 886
34.1 Patent Retrieval Versus Information Retrieval 887
34.2 Boolean Retrieval Model 890
34.3 Basic Patent Retrieval Model 890
34.4 Enhancements and Extensions to the Basic Patent Retrieval Model 893
34.5 Dynamic Patent Retrieval Models 898
34.6 Conclusions 900
References 900
35 The Role of the Patent Attorney in the Filing Process 902
35.1 Starting Points 902
35.2 Literature Review and Regulations 904
35.3 Basic Research Questions 905
35.4 Descriptive Results 907
35.5 Multivariate Results 911
35.6 Summarizing Discussion 913
References 914
36 Exploiting Images for Patent Search 916
36.1 How Patent Document Analysis Evolved 916
36.2 Patent Search Scenario and Requirements 917
36.3 Feature Extraction 918
36.4 Content-Based Patent Image Retrieval 919
36.5 Concept-Based Patent Image Retrieval 925
36.6 Conclusion 931
References 932
37 Methodological Challenges for Creating Accurate Patent Indicators 934
37.1 New Methodological Issues 934
37.2 International Patent Flows 934
37.3 Costs of Patent Applications 938
37.4 Patent Applications to Foreign Countries 939
37.5 International Country Comparisons 941
37.6 Effectiveness of Keyword Searches 943
37.7 Features of the Cooperative Patent Classification 944
37.8 Patents of Large Companies 947
37.9 Patent Value 949
37.10 The Impact of Legal Changes on Statistics 952
37.11 Conclusion 952
References 952
38 Using Text Mining Algorithms for Patent Documents and Publications 955
38.1 Text Mining and Science and Technology Studies 955
38.2 Practical Text Mining Procedure 957
38.3 Specific Text Mining Models 959
38.4 Document Similarity: Validation Studies 961
38.5 Clustering and Topic Modeling Case Studies 972
38.6 Conclusions, Discussion, Limitations, and Directions for Further Research 980
References 980
39 Application of Text-Analytics in Quantitative Study of Science and Technology 983
39.1 Background 983
39.2 Literature Review on the Application of Text Mining 984
39.3 Case Studies 994
39.4 Discussion and Conclusion 1002
References 1003
40 Functional Patent Classification 1009
40.1 Patent Classifications 1010
40.2 A Brief History of Functional Analysis 1011
40.3 Patent Search and the Limitations of Existing Patent Classifications 1016
40.4 Functional Patent Classification: Three Case Studies 1019
40.5 Conclusions and Future Research 1025
References 1026
Part F Patent System, Patents and Economics 1030
41 Computer-Implemented Inventions in Europe 1032
41.1 Starting Points 1032
41.2 A Brief Introduction to the Economics of Intellectual Property Rights 1033
41.3 Patentability of Computer Programs: Historical Developments and the Status Quo 1036
41.4 Definition and Operationalization of Computer-Implemented Inventions 1037
41.5 Empirical Trends in CII Filings 1040
41.6 Summary and Implications 1044
References 1044
42 Interplay of Patents and Trademarks as Tools in Economic Competition 1048
42.1 Pattern of R&D-Intensive Enterprises
42.2 The Approach to Studying the Interplay of Patents and Trademarks 1049
42.3 Empirical Basis of the Analysis 1050
42.4 Assessment of Indicators 1050
42.5 Conclusions 1057
References 1058
43 Post Catch-up Trajectories: Publishing and Patenting Activities of China and Korea 1061
43.1 Background 1061
43.2 Conceptual Framework and Data 1064
43.3 Findings and Discussion 1066
43.4 Conclusion 1077
References 1077
44 Standardization and Standards as Science and Innovation Indicators 1080
44.1 Background 1080
44.2 Definitions and Processes 1081
44.3 Current Opportunities 1081
44.4 Future Challenges 1085
44.5 Relevance for Decision Makers in Industry and Policy 1087
References 1088
Detailed Contents 1092
Subject Index 1114
Publication date | 30.10.2019
Series | Springer Handbooks
Additional information | XXXVIII, 1103 p., 279 illus. in color
Language | English
Subject areas | Humanities ► Philosophy ► Ethics; Technology; Economics ► Business Administration / Management ► Logistics / Production; Economics ► Economics
Keywords | Altmetrics • Bibliometrics • bibliometric indicators • Citation analysis • Impact Factor • Informetrics • network analysis • patent analysis • patent classification systems • Research Assessment • research ethics • science indicators • Science Mapping • Scientometric indicators • Scientometrics • Social Media Metrics • web-based indicators • Webometrics
ISBN-10 | 3-030-02511-X / 303002511X
ISBN-13 | 978-3-030-02511-3 / 9783030025113
File size: 32.2 MB
DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized for you. If the eBook is improperly passed on to third parties, it can be traced back to the source.
File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly suitable for specialist books with columns, tables and figures. A PDF can be displayed on almost all devices, but is only of limited use on small displays (smartphone, eReader).
System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read with (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/tablet: Whether Apple or Android, you can read this eBook. You will need a PDF viewer, e.g. the free Adobe Digital Editions app.
Additional feature: online reading
In addition to downloading it, you can also read this eBook online in your web browser.
Buying eBooks from abroad
For tax law reasons, we can only sell eBooks within Germany and Switzerland. Unfortunately, we cannot fulfill eBook orders from other countries.