Annals of Telecommunications

An international journal publishing original peer-reviewed papers

Special issue | The role of telecommunications in electronic voting

Vol. 71, n° 7-8, July-August 2016
Content available on Springerlink

Guest editors

J Paul Gibson, Télécom SudParis, France
Robert Krimmer, Tallinn University of Technology, Estonia
Vanessa Teague, University of Melbourne, Australia
Julia Pomares, CIPPEC, Argentina

 

Editorial introduction

A review of E-voting: the past, present and future

J Paul Gibson1, Robert Krimmer2, Vanessa Teague3, Julia Pomares4

(1) Télécom SudParis, France
(2) Tallinn University of Technology, Estonia
(3) University of Melbourne, Australia
(4) CIPPEC, Argentina

 

Crowdsourced integrity verification of election results
An experience from Brazilian elections

Diego F. Aranha, Helder Ribeiro, André Luis Ogando Paraense

University of Campinas, Brazil

Abstract In this work, we describe an experiment for evaluating the integrity of election results and improving transparency and voter participation in electronic elections. The idea was based on two aspects: distributed collection of poll tape pictures, taken by voters using mobile devices; and crowdsourced comparison of these pictures with the partial electronic results published by the electoral authority. The solution allowed voters to verify whether results were correctly transmitted to the central tabulator without manipulation, at the granularity of individual polling places. We present results and discuss limitations of the approach and future perspectives in the context of the 2014 Brazilian presidential elections, where the proposed solution was employed for the first time. In particular, with the aid of our project, voters were able to verify 1.6 % of the total poll tapes, amounting to 4.1 % of the total votes, which prompted the electoral authority to announce improved support for automated verification in the next elections. While the case study relies on the typical workflow of a paperless DRE-based election, the approach can be improved and adapted to other types of voting technology.

Keywords Crowdsourcing – Transparency – Verification of election results
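The crowdsourced comparison step the abstract describes can be sketched as a simple per-polling-place reconciliation. The data structures, function names, and polling-place identifiers below are hypothetical illustrations, not the project's actual format:

```python
# Hypothetical sketch: tallies transcribed from photographed poll tapes are
# matched against the per-polling-place totals published by the electoral
# authority. All names and numbers here are illustrative.

def compare_tallies(photographed, published):
    """Return the polling places whose photographed poll tape disagrees
    with (or is missing from) the published electronic result."""
    mismatches = []
    for place, tape_counts in photographed.items():
        official = published.get(place)
        if official is None or official != tape_counts:
            mismatches.append(place)
    return mismatches

def coverage(photographed, published):
    """Fraction of published polling places verified by at least one photo."""
    return len(photographed.keys() & published.keys()) / len(published)

published = {
    "PP-001": {"A": 120, "B": 80},
    "PP-002": {"A": 95, "B": 105},
    "PP-003": {"A": 60, "B": 140},
}
photographed = {
    "PP-001": {"A": 120, "B": 80},   # matches the published result
    "PP-002": {"A": 90, "B": 110},   # disagreement: flagged for review
}

print(compare_tallies(photographed, published))
print(f"coverage: {coverage(photographed, published):.0%}")
```

Any flagged polling place would be handed to manual review; the coverage figure is the analogue of the 1.6 % of poll tapes the project verified.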

 

An experiment on the security of the Norwegian electronic voting protocol

Kristian Gjøsteen, Anders Smedstuen Lund

NTNU, Trondheim, Norway

Abstract Even when using a provably secure voting protocol, an election authority cannot argue convincingly that no attack that changed the election outcome has occurred, unless the voters are able to use the voting protocol correctly. We describe one statistical method that, if the assumptions underlying the protocol’s security proof hold, could provide convincing evidence that no attack occurred for the Norwegian Internet voting protocol (or other similar voting protocols). To determine the statistical power of this method, we need to estimate the rate at which voters detect possible attacks against the voting protocol. We designed and carried out an experiment to estimate this rate. We describe the experiment and results in full. Based on the results, we estimate upper and lower bounds for the detection rate. We also discuss some limitations of the practical experiment.

Keywords Usability experiment – Attack detection – Electronic voting
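One standard way to turn experimental counts into the upper and lower bounds on the detection rate that the abstract mentions is a binomial confidence interval. A minimal sketch using the Wilson score interval, with illustrative counts rather than the paper's data:

```python
# Sketch: bound an attack-detection rate from (detections, trials) counts
# using the Wilson score interval for a binomial proportion.
# The counts below are illustrative, not the experiment's data.
import math

def wilson_interval(detections, trials, z=1.96):
    """Approximate 95 % Wilson score interval for the detection probability."""
    p = detections / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return centre - half, centre + half

low, high = wilson_interval(detections=18, trials=60)
print(f"detection rate in [{low:.3f}, {high:.3f}] with ~95% confidence")
```

More trials tighten the interval, which is why the experiment's sample size drives the statistical power of the method.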

 

An investigation into the usability of electronic voting systems for complex elections

Jurlind Budurushi1, Karen Renaud2, Melanie Volkamer1, Marcel Woide1

(1) Technische Universität Darmstadt, Germany
(2) University of Glasgow, UK

Abstract Many studies evaluate the usability of electronic voting systems in the context of simple elections. Complex elections, which take place in many European countries, also merit attention. The complexity of the voting process, as well as that of the tallying and verification of the ballots, makes usability even more crucial in this context. Complex elections, both paper-based and electronic, challenge voters and electoral officials to an unusual extent. In this work, we present two studies of an electronic voting system that is tailored to the needs of complex elections. In the first study, we evaluate the effectiveness of the ballot design with respect to motivating voters to verify their ballot. Furthermore, we identify factors that motivate voters to verify, or not to verify, their ballot. The second study also addresses the effectiveness of the ballot design in terms of verification, but this time from the electoral officials’ perspective. Last, but not least, we evaluate the usability of the implemented EasyVote prototype from both the voter and electoral official perspectives. In both studies, we were able to improve effectiveness without impacting efficiency and satisfaction. Despite these usability improvements, it became clear that voters who trusted the electronic system were unlikely to verify their ballots. Moreover, these voters failed to detect the “fraudulent” manipulations. It is clear that well-formulated interventions are required in order to encourage verification and to improve the detection of errors or fraudulent attempts.

Keywords Electronic voting – Usability – Verification – Paper audit trails – Complex elections

 

Receipt-free remote electronic elections with everlasting privacy

Philipp Locher1,2, Rolf Haenni1

(1) Bern University of Applied Sciences, Switzerland
(2) University of Fribourg, Switzerland

Abstract We present a new cryptographic voting protocol for remote electronic voting that offers three of the most challenging features of such protocols: verifiability, everlasting privacy, and receipt-freeness. Trusted authorities and computational assumptions are only needed during vote casting and tallying to prevent the creation of invalid ballots and to achieve receipt-freeness and fairness, but not to guarantee vote privacy. The implementation of everlasting privacy is based on perfectly hiding commitments and non-interactive zero-knowledge proofs, whereas receipt-freeness is realized with mix networks and homomorphic tallying.

Keywords Verifiable elections – Everlasting privacy – Receipt-freeness – Zero-knowledge proofs
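The perfectly hiding commitments that underpin everlasting privacy can be illustrated with a Pedersen commitment, a common construction of this kind (the abstract does not name the exact scheme used). The group parameters below are toy values chosen only so the sketch runs quickly:

```python
# Toy Pedersen commitment: C = g^v * h^r mod p is perfectly hiding in the
# randomness r, and binding under a discrete-log assumption. The parameters
# here are tiny demonstration values; real deployments use ~2048-bit groups,
# and h must be generated so its discrete log w.r.t. g is unknown.
import secrets

p, q = 1019, 509        # toy safe prime p = 2q + 1
g = pow(2, 2, p)        # squares generate the order-q subgroup
h = pow(3, 2, p)        # second generator (toy: its dlog w.r.t. g is knowable)

def commit(vote, r=None):
    """Commit to `vote` with fresh randomness r drawn below the group order."""
    r = secrets.randbelow(q) if r is None else r
    return (pow(g, vote, p) * pow(h, r, p)) % p, r

def open_commitment(c, vote, r):
    """Check that (vote, r) is a valid opening of commitment c."""
    return c == (pow(g, vote, p) * pow(h, r, p)) % p

c, r = commit(vote=1)
print(open_commitment(c, 1, r))   # valid opening
print(open_commitment(c, 0, r))   # wrong vote does not open the commitment
```

Because the commitment is perfectly hiding, even an unbounded future adversary learns nothing about the vote from c alone, which is the essence of everlasting privacy.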

 

SecIVo: a quantitative security evaluation framework for internet voting schemes

Stephan Neumann1, Melanie Volkamer1,2, Jurlind Budurushi1, Marco Prandini3

(1) Technische Universität Darmstadt, Germany
(2) Karlstad University, Sweden
(3) Università di Bologna, Italy

Abstract Voting over the Internet is subject to a number of security requirements. Each voting scheme has its own bespoke set of assumptions to ensure these security requirements. The criticality of these assumptions depends on the election setting (e.g., how trustworthy the voting servers or the voting devices are). The consequence of this is that the security of different Internet voting schemes cannot easily be compared. We have addressed this shortcoming by developing SecIVo, a quantitative security evaluation framework for Internet voting schemes. On the basis of uniform adversarial capabilities, the framework provides two specification languages, namely qualitative security models and election settings. Upon system analysis, system analysts feed the framework with qualitative security models composed of adversarial capabilities. Election officials, in turn, specify their election setting in terms of, among others, expected adversarial capabilities. The framework evaluates the qualitative security models within the given election setting and returns satisfaction degrees for a set of security requirements. We apply SecIVo to quantitatively evaluate Helios and Remotegrity within three election settings. It turns out that neither scheme outperforms the other in all settings. Consequently, selecting the most appropriate scheme from a security perspective depends on the environment into which the scheme is to be embedded.

Keywords Internet voting – Security evaluation – Security requirements

Open Topics

Exact approach for the optimal design of virtual private network trees assuming a hose workload

Ali Lourimi, Boulbaba Thabti, Habib Youssef

University of Sousse, Tunisia

Abstract Virtual private network (VPN) design according to a tree topology has been the subject of numerous research papers. Two workload models are commonly used to allow VPN clients to specify the communication capacity they need: the hose and the pipe workload models. As opposed to the pipe model, where bandwidth needs between every pair of endpoints must be specified as a matrix, the hose model has the advantage of simple specification, where only one ingress and one egress bandwidth per hose endpoint are specified. However, the tree bandwidth costs obtained with the hose workload model are higher by a factor of as much as 2.5 compared to those obtained with pipe workloads Duffield et al. (SIGCOMM Comput Commun Rev 29(4):95–108, 1999). In this work, we propose a two-step exact approach to design a VPN tree with minimum bandwidth cost. The first step derives a pipe workload from the user-specified hose workload using an exact algorithm. The second step formulates the pipe-based VPN tree bandwidth minimization as a 0–1 integer linear program, which is solved using the exact approach proposed in Thabti et al. (1–6, 2012). The bandwidth costs of VPN trees obtained using this two-step approach are lower by a factor varying between 1.31 and 2.23 compared to VPN trees obtained using the original hose workload. Furthermore, we show that tree solutions obtained using the derived pipe workload satisfy the original hose workload.

Keywords Virtual private network – Hose model – Branch-and-cut – Maximum flow problem – Cutting plane – Separation problem
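The hose model's simple specification rests on a simple capacity rule for tree edges: removing an edge splits the endpoints into two sides, and the traffic crossing it in one direction is bounded by min(total egress of one side, total ingress of the other). A minimal sketch of that rule, with illustrative endpoints and bandwidths:

```python
# Sketch of the standard hose-model reservation rule for a single tree edge.
# Removing the edge partitions the VPN endpoints into sides S and T; traffic
# from S to T cannot exceed min(sum of S's egress bounds, sum of T's ingress
# bounds). Endpoint names and bandwidths below are illustrative.

def edge_capacity(side_s, side_t, egress, ingress):
    """Bandwidth to reserve on one tree edge (both directions summed)."""
    s_to_t = min(sum(egress[v] for v in side_s), sum(ingress[v] for v in side_t))
    t_to_s = min(sum(egress[v] for v in side_t), sum(ingress[v] for v in side_s))
    return s_to_t + t_to_s

# Three VPN endpoints around one edge: {a} on one side, {b, c} on the other.
egress  = {"a": 10, "b": 4, "c": 3}
ingress = {"a": 10, "b": 5, "c": 5}
print(edge_capacity({"a"}, {"b", "c"}, egress, ingress))
```

Summing this quantity over all tree edges gives the hose-workload tree cost that the paper's two-step pipe-based approach undercuts.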

 

A novel scaling and early stopping mechanism for LTE turbo code based on regression analysis

T. P. Fowdur, Y. Beeharry, K. M. S. Soyjaudah

University of Mauritius, Réduit, Mauritius

Abstract In this paper, a new extrinsic information scaling and early stopping mechanism for the Long Term Evolution (LTE) turbo code is proposed. A scaling factor is obtained by computing the Pearson correlation coefficient between the extrinsic and a posteriori log-likelihood ratios (LLRs) at every half-iteration. Additionally, two new stopping criteria are proposed. The first one uses the regression angle, which is computed at each half-iteration and is applied at low Eb/N0. The second one uses the Pearson correlation coefficient and is applicable for high Eb/N0 values. The performance of the proposed scheme was compared against an existing scaling and stopping mechanism based on the sign difference ratio (SDR) technique, as well as against the conventional LTE turbo code. Simulations have been performed with both quadrature phase shift keying (QPSK) modulation and 16-quadrature amplitude modulation (QAM), together with code rates of 1/3 and 1/2. The results demonstrate that the proposed scheme outperforms both the conventional scheme and the scheme employing the SDR-based scaling and stopping mechanism in terms of BER performance and average number of decoding iterations. The performance analysis using EXIT charts for each scheme shows higher initial output mutual information for input mutual information of zero. Better convergence is also demonstrated by the wider tunnel of the proposed scheme. Additionally, the computational complexity analysis demonstrates a significant gain in terms of the average number of computations per packet with the different modulation and coding schemes, while still gaining in terms of error performance.

Keywords Regression angle – Correlation coefficient – Early stopping – Scaling – Turbo code
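The scaling step can be sketched as computing the Pearson correlation coefficient between the extrinsic and a posteriori LLR vectors at a half-iteration and using it to scale the extrinsic output. The LLR values below are illustrative, not decoder output:

```python
# Sketch: Pearson's correlation coefficient between the extrinsic and
# a posteriori LLR vectors serves as the extrinsic scaling factor for the
# next half-iteration. The LLR values are illustrative.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

extrinsic   = [1.2, -0.8, 2.5, -1.9, 0.4]
aposteriori = [1.5, -1.1, 3.0, -2.2, 0.6]

s = pearson(extrinsic, aposteriori)   # close to 1 when the decoder converges
scaled = [s * e for e in extrinsic]   # scaled extrinsic fed onward
print(f"scaling factor: {s:.4f}")
```

As the decoder converges, the two LLR vectors become strongly correlated, so the coefficient (and hence the scaling) approaches one, which also makes it usable as a stopping signal.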

 

Power allocation for device-to-device communication underlaying cellular networks under a probabilistic eavesdropping scenario

Junyue Qu1, Yueming Cai1, Jianchao Zheng1, Wendong Yang1, Dan Wu1, Yajie Hu2

(1) College of Communications Engineering, Nanjing, China
(2) PLA, Jinan, China

Abstract In this paper, we focus on security issues arising from the open structure of device-to-device (D2D) communication underlaying cellular networks. In such an open scenario, interference is a serious problem, but it can also be helpful from the perspective of physical-layer security: when its level is properly controlled, the interference caused by D2D communication can hinder eavesdroppers and thereby enhance the secure communication of cellular users. Building on this observation, the physical-layer security of cellular users can be enhanced through interference management based on power allocation in D2D communication underlaying cellular networks under a probabilistic eavesdropping scenario. We model the problem as a Stackelberg game in which the cellular users are the followers and the D2D pair is the leader. A semi-centralized power allocation algorithm is proposed that converges to the Stackelberg equilibrium, and this equilibrium constitutes the final power allocation scheme. We prove that the proposed algorithm terminates in a finite number of iterations. Numerical simulation results show that our proposed power allocation algorithm achieves a larger secrecy data rate than two other power allocation algorithms.

Keywords Device-to-device – Probabilistic eavesdropping – Stackelberg game – Power allocation – Security
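The intuition that properly sized D2D interference helps can be sketched with the standard secrecy-rate expression, where a cellular user's secrecy rate is the legitimate link rate minus the eavesdropper's rate. The channel gains and powers below are illustrative, not the paper's system model:

```python
# Sketch: secrecy rate Cs = max(0, log2(1 + SINR_legit) - log2(1 + SINR_eve)).
# D2D transmit power adds interference at both receivers; when it hurts the
# eavesdropper more than the legitimate receiver, the secrecy rate grows.
# All gains/powers are illustrative toy values (noise power normalised to 1).
import math

def rate(signal, interference, noise=1.0):
    """Shannon rate in bits/s/Hz given signal power and interference."""
    return math.log2(1 + signal / (interference + noise))

def secrecy_rate(p_cell, g_legit, g_eve, p_d2d, h_legit, h_eve):
    """Cellular user's secrecy rate under D2D interference power p_d2d."""
    r_legit = rate(p_cell * g_legit, p_d2d * h_legit)
    r_eve = rate(p_cell * g_eve, p_d2d * h_eve)
    return max(0.0, r_legit - r_eve)

# Without D2D interference, the eavesdropper hears almost as well as the base
# station, so little secrecy remains.
print(secrecy_rate(10, 1.0, 0.9, p_d2d=0, h_legit=0.1, h_eve=0.8))
# Moderate D2D power degrades the eavesdropper far more than the legitimate
# link, enlarging the secrecy rate.
print(secrecy_rate(10, 1.0, 0.9, p_d2d=5, h_legit=0.1, h_eve=0.8))
```

Choosing p_d2d to balance this benefit against the D2D pair's own objectives is what the paper's Stackelberg formulation captures.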

 

A multi-GNSS software-defined receiver: design, implementation, and performance benefits

Stefan Söderholm, Mohammad Zahidul H. Bhuiyan, Sarang Thombre, Laura Ruotsalainen, Heidi Kuusniemi

Finnish Geospatial Research Institute, Kirkkonummi, Finland

Abstract Global navigation satellite systems (GNSSs) have been experiencing a rapid growth in recent years with the inclusion of the Galileo and BeiDou navigation satellite systems. The existing GPS and GLONASS systems are also being modernized to better serve the current challenging applications under harsh signal conditions. Therefore, the research and development of GNSS receivers have been experiencing a new upsurge in view of multi-GNSS constellations. In this article, a multi-GNSS receiver design is presented in various processing stages for three different GNSS systems, namely, GPS, Galileo, and the Chinese BeiDou navigation satellite system (BDS). The developed multi-GNSS software-defined receiver performance is analyzed with real static data and utilizing a hardware signal simulator. The performance analysis is carried out for each individual system, and it is then compared against each possible multi-GNSS combination. The true multi-GNSS benefits are also highlighted via an urban scenario test carried out with the hardware signal simulator. In open sky tests, the horizontal 50 % error is approximately 3 m for GPS only, 1.8 to 2.8 m for combinations of any two systems, and 1.4 m when using GPS, Galileo, and BDS satellites. The vertical 50 % error reduces from 4.6 m to 3.9 m when using all three systems compared to GPS only. In severe urban canyons, the position error for GPS only can be more than ten times larger, and the solution availability can be less than half of the availability for a multi-GNSS solution.

Keywords Multi-GNSS – BeiDou – Galileo – Software receiver – SDR – Performance analysis – Receiver architecture
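The "horizontal 50 % error" statistic reported above is the median horizontal distance between position fixes and the true antenna location. A minimal sketch with illustrative east/north errors in metres:

```python
# Sketch: the horizontal 50 % error is the median 2-D distance from each
# position fix to the reference (true) location. The fixes below are
# illustrative east/north offsets in metres, not receiver output.
import math
import statistics

def horizontal_50_error(fixes, truth):
    """Median horizontal distance (metres) from each fix to the reference."""
    errors = [math.hypot(e - truth[0], n - truth[1]) for e, n in fixes]
    return statistics.median(errors)

fixes = [(0.5, 1.0), (-1.2, 0.3), (2.0, -2.5), (0.1, -0.4), (1.5, 1.1)]
print(f"horizontal 50% error: {horizontal_50_error(fixes, (0.0, 0.0)):.2f} m")
```

The median (rather than the mean) makes the figure robust to the occasional large outlier fix, which matters particularly in the urban-canyon tests.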

 

Performance of sigma-delta quantizers in oversampled odd-stacked CMFB systems

Fatma Abdelkefi1, Jaouhar Ayadi2

(1) University of Carthage, Tunis, Tunisia
(2) ECLEXYS, Riva San Vitale, Switzerland

Abstract In this paper, we investigate the performance of the joint use of odd-stacked cosine modulated filter banks (CMFBs) and first- and second-order Sigma-Delta (ΣΔ) quantization for communication systems when the signal expansion frame is infinite. This performance is evaluated in terms of the decrease of the reconstruction error of the signal that is jointly represented through the CMFBs and the ΣΔ quantization schemes. To begin with, we derive closed-form expressions of upper bounds on the signal reconstruction minimum square error (MSE) for both first- and second-order ΣΔ quantization cases. These upper bounds are derived irrespective of any quantization noise assumption that could be made in the considered ΣΔ quantization scheme. Exploiting the obtained upper-bound closed-form expressions, we demonstrate that under a set of conditions, this signal reconstruction MSE decays as 1/r², where r denotes the redundancy of the signal expansion frame. The obtained results are shown to hold under the widely used additive white quantization noise assumption, for which we also determine explicit analytical signal reconstruction MSE expressions when the CMFBs are combined with first- and second-order quantizers. Simulation results are given to support our claims.

Keywords First- and second-order Sigma-Delta (ΣΔ) quantization – Oversampled cosine modulated filter banks (CMFB) – Frame redundancy – Infinite frame – Additive white quantization noise – Reconstruction minimum square error (MSE)
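First-order ΣΔ quantization of the kind analysed above can be sketched with the classic error-feedback recursion q_n = Q(x_n + u_{n-1}), u_n = u_{n-1} + x_n − q_n. The step size and input values below are illustrative:

```python
# Sketch of a first-order Sigma-Delta quantizer: each sample is quantized
# together with the accumulated error state u, and u absorbs the new
# quantization error (noise shaping). Under the no-overload condition the
# state stays bounded, which drives the MSE decay with frame redundancy.
# Step size and inputs are illustrative.

def sigma_delta(samples, step=0.5):
    """Quantize to the nearest multiple of `step` with error feedback."""
    u, out = 0.0, []
    for x in samples:
        q = step * round((x + u) / step)   # mid-tread uniform quantizer
        u += x - q                         # carry the quantization error
        out.append(q)
    return out

x = [0.30, 0.30, 0.30, 0.30]
print(sigma_delta(x))   # coarse outputs whose running average tracks x
```

Because the error is fed back rather than discarded, averaging the outputs (as a redundant frame reconstruction effectively does) cancels much of the quantization noise, which is the mechanism behind the 1/r² MSE decay analysed in the paper.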
