
(1)

Quality of Experience and Quality Feedback

Markus Fiedler

Blekinge Institute of Technology School of Engineering

Dept. of Telecommunication Systems Karlskrona, Sweden

HET-NETs’06 Tutorial T13 Ilkley, UK, Sept. 2006

(2)

My Own Background (1)

Moved from the network towards the user ☺

• Working with Grade of Service/Quality of Service issues since 1992

– Admission control, dimensioning

• Got interested in end-user throughput perception in 2000

– “Kilroy”-Indicator 2002 co-developed with Kurt Tutschku, University of Würzburg

• E-Government project 2002—2004
– Implications of IT problems

• Preparation of the NoE EuroNGI 2003

(3)

EuroNGI-Related Activities

• Leader of

– Joint Research Activity JRA.6 “Socio-Economic Aspects of Next Generation Internet”

– Work Package WP.JRA.6.1 “Quality of Service from the users’ perspective and feedback mechanisms for quality control”

– Work Package WP.JRA.6.3 “Creation of trust by advanced security concepts”

• EuroNGI-sponsored AutoMon project (2005)
– Improved discovery of end-to-end problems
– Improved quality feedback facilities

(4)

My Own Background (2)

• Projects within Intelligent Transport Systems and Services since 2003

– Timely delivery is crucial (dependability, safety)

– Network Selection Box (GPRS/UMTS/WLAN)
– How to match technical parameters and user perception?

• Surprised that rather little attention has been paid to user-related issues by “our” scientific community

(5)

Thesis 1:

Users do have – sometimes unconscious – expectations regarding ICT performance

(6)

Quality Problems?!?

(7)
(8)

Perception of Response Times

[Figure: response-time scale with perception bands – up to about 100 ms the system “reacts promptly”; between 100 ms and 1 s “there is a delay”; between 1 s and 10 s the “flow of thoughts is interrupted”; beyond about 10 s the service becomes “uninteresting, boring”.]

• Most users do not care about “technical” parameters such as Round Trip Time (RTT), one-way delay, losses, throughput variations, ...
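To make the bands in the figure concrete, here is a minimal Python sketch that maps a measured response time to the perception bands above; the function name, wording of the categories and the example value are illustrative assumptions, not part of the tutorial.

```python
# Minimal sketch, assuming the 100 ms / 1 s / 10 s boundaries from the figure above.

def perceived_responsiveness(response_time_s: float) -> str:
    """Map a measured response time (in seconds) to the perception bands on the slide."""
    if response_time_s <= 0.1:
        return "reacts promptly"
    if response_time_s <= 1.0:
        return "there is a delay"
    if response_time_s <= 10.0:
        return "flow of thoughts interrupted"
    return "uninteresting, boring"

print(perceived_responsiveness(0.35))  # -> "there is a delay"
```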

(9)

Some User Reactions (1)

• Study by HP (2000) [1]

• Test customers were exposed to varying latencies when composing a computer in a web shop and had to rate the service quality
• Some of their comments are found below:
• Understanding that there’s a lot of people coming together on the process makes us more tolerant
• This is the way the consumer sees the company...it should look good, it should be fast

(10)

Some User Reactions (2)

• If it’s slow I won’t give my credit card number

• As long as you see things coming up it’s not nearly as bad as just sitting there waiting and again you don’t know whether you’re stuck

• I think it’s great...saying we are unusually busy, there may be some delays, you might want to visit later. You’ve told me now. If I decide to go ahead, that’s my choice.

• You get a bit spoiled. I guess once you’re used to the quickness, then you want it all the time

(11)

Consequences?

[2] summarises:

• 82% of customer defections are due to frustration over the product or service and the inability of the provider/operator to deal with this effectively
• ... on average, one frustrated customer will tell 13 other people about their bad experiences ...
• For every person who calls with a problem, there are 29 others who will never call.
• About 90% of customers will not complain before defecting – they will simply leave once they become unsatisfied.

Shortcomings in perceived dependability are likely to cause churn!

(12)

Quality of Service (QoS)

• Telecom view

– ITU-T E.800 (1994) defines QoS as “the collective effect of service performance which determine the degree of satisfaction of a user of the service”, including

• Service support performance

• Service operability performance

• Serveability (Service accessibility/retainability/integrity performance)

• Service security performance

– QoS measures are only quantifiable at a service access point

(13)

Quality of Service (QoS)

• Internet view

– Property of the network and its components

• “Switch A has Quality of Service”

– Some kind of “Better-than-best-effort” packet forwarding/routing

• RSVP

• IntServ

• DiffServ

• Performance researcher view

– Results from queuing analysis

(14)

Quality of Experience (QoE) [2, 3]

• Rather new concept, even more user-oriented than QoS: “how a user perceives the usability of a service when in use – how satisfied he or she is with a service” [2].

• Includes

– End-to-end network QoS

– Factors such as network coverage, service offers, level of support, etc.

– Subjective factors such as user expectations, requirements, particular experience

• Economic background: Disappointed user may leave and take others with him/her.

(15)

Quality of Experience (QoE)

• Key Performance Indicators (KPI)

– Reliability (service quality of accessibility and retainability)

• Service availability

• Service accessibility

• Service access time

• Continuity of service

– Comfort (service quality of integrity KPIs)

• Quality of session

• Ease of use

• Level of support

• Need to be measured as realistically as possible

(16)

Thesis 2:

There is a need for more explicit feedback to make the user feel more confident

(17)

Typical Feedbacks

Cf. [4], Section 2.4

(18)

Types of Feedback

• Explicit feedback

– Positive/negative acknowledgements

• E.g. TCP

– Asynchronous notifications

• E.g. SNMP traps

• Implicit feedback

– Can be obtained through observing whether/how a process is happening
– Dominating the Internet as of today

(19)

1. Feedback from the Network

a. Network → Application
• Implicit: No or late packet delivery
b. Network → Network Provider
• Classical Network Management/monitoring
c. Network → User
• Implicit: “Nothing happens...”
• Rudimentary tools available
• Operating system issues warnings
Within the network stack: control packets

(20)

2. Feedback from the Application

a. Application → Application
• Some applications measure the performance of the packet transfer and adapt themselves (e.g. Skype, videoconferencing)
b. Application → User
• Implicit, by not working as it is supposed to
• Explicit, by notifying the user or adapting itself
c. Application → Service Provider
• Active measurements of service performance
d. Application → Network Provider
• Monitoring of control PDUs

(21)

3. Feedback from the User

Implicit: give up / go away = churn
Explicit:
a. User → network operator
• Blame the closest ISP
• Not uncommon ISP attitudes:
• The problem is somewhere else
• The user is an idiot
b. User → service provider
• Online quality surveys
c. User → application
• Change settings

(22)

4. Feedback from the Service Provider

• Towards the network operator in case of trouble

• Part of the one-stop service concept [4]:

– Service provider = primary point of contact for the user of a service

– User relieved from having to search for the problem (which is the service provider’s business)

(23)

The Auction Approach

Cf. [5], Chapter 5

(24)

Feedback Provided by Bandwidth Auctions

a. Bidding for resources on behalf of the user
b. Signaling of success or failure

c. Results communicated towards the user

• Successful transfer at reasonable QoS

• Unsuccessful transfer at low cost

d. Results communicated to network (and perhaps even service) provider

• Dimensioning

• SLA

(25)

The AutoMon Approach

Cf. [5], Chapter 6

(26)

AutoMon Feedback

• DNA (Distributed Network Agent) = main element in a self-organising monitoring overlay
a. Local tests using locally available tools

b. Remote tests and inter-DNA communication

• Comparison of measurement results

c. Alarms towards {network|service} provider(s) in case of perceived problems

• E.g. using SNMP traps

d. Lookup facilities for providers

• E.g. saving critical observations in a local MIB
e. Notification facilities towards users

• Not mandatory, but maybe helpful

(27)

The AutoMon Project

• Design and Evaluation of Distributed, Self-Organized QoS Monitoring for Autonomous Network Operation (http://www.informatik.uni-wuerzburg.de/staff/automon)

• Sponsored by the Network of Excellence EuroNGI (http://www.eurongi.org)

• Partners (and Prime Investigators)

– Infosim GmbH & Co. KG, Würzburg (S. Köhler, M. Schmid)

– University of Würzburg, Dept. of Distributed Systems (K. Tutschku, A. Binzenhöfer)

– Blekinge Institute of Technology, Dept. of Telecomm. Systems (M. Fiedler, S. Chevul)

(28)

The AutoMon Concept

1. DNA = Distributed Network Agent
– Self-organising

– Prototype available

• Network operations

• Simulations

2. NUF = Network Utility Function
– Quality evaluation: user impairment = f(network problems)
– Focus on throughput (TUF)

3. QJudge = Demonstrator for
– Quality evaluation (traffic-lights approach)
– Feedback generation (traps)

– MIB

(29)

The Way To Autonomous Networks

[Diagram: autonomous managers attached to an IT system (e.g. LAN/MAN) form a control loop – observe, analyze, act – on its inputs and outputs.]

(30)

Disadvantages of a Central Monitor Station

[Diagram: a central NMS monitors mail server, web server, backup server and clients A–D, but the status of many links remains unknown to it. Legend: link status up / down / unknown.]

(31)

Advantages of Distributed Monitoring

[Diagram: DNAs placed at mail server, web server, DNS server and clients A–D give an extended view compared to a central NMS; a DNA can reroute and act as a temporary DNS proxy. Legend: link status up / down / unknown.]

(32)

DNA Phase 1: Local Tests

[Diagram: a DNA checks its own host and local connectivity (cable? IP? ping!).]

Locally available tests:
– NIC-Status
– NetConnectionStatus
– PingLocalHost
– IPConfiguration
– DNSConfiguration
– DHCPLease
– EventViewer
– HostsAndLmHosts
– RoutingTable
– PingOwnIP
– PingWellKnownHost
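As an illustration of what such Phase-1 tests can look like, the sketch below runs a few of the checks listed above (PingLocalHost, PingOwnIP, PingWellKnownHost). It is not the DNA prototype’s code: the use of the system ping command (Linux-style options), the chosen well-known host and the function names are assumptions.

```python
# Hedged sketch of Phase-1-style local tests (Linux "ping" options assumed).
import socket
import subprocess

def ping(host: str, count: int = 1, timeout_s: int = 2) -> bool:
    """Return True if the host answers an ICMP echo request."""
    result = subprocess.run(
        ["ping", "-c", str(count), "-W", str(timeout_s), host],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def local_tests() -> dict:
    own_ip = socket.gethostbyname(socket.gethostname())
    return {
        "PingLocalHost": ping("127.0.0.1"),
        "PingOwnIP": ping(own_ip),
        "PingWellKnownHost": ping("www.example.org"),  # illustrative choice
    }

if __name__ == "__main__":
    for test, ok in local_tests().items():
        print(f"{test}: {'OK' if ok else 'FAILED'}")
```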

(33)

DNA Phase 2: Distributed Tests

[Diagram: DNAs ask each other to run tests (“Test please”), ping servers and exchange the results.]

Distributed tests:
– PingSpecificHost
– PingWellKnownHosts
– DNSProxy
– RerouteProxy
– PortScan
– Throughput
– Pinpoint Module
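A hedged sketch of the request/compare pattern behind these distributed tests: one DNA asks a peer to run the same test (“Test please”) and draws a rough conclusion from the two results. The Dna class, the TCP-based reachability check and the verdict texts are illustrative assumptions, not the prototype’s implementation.

```python
# Illustrative sketch of the "Test please" / result-comparison pattern between two DNAs.
import socket

def reachable(host: str, port: int = 80, timeout_s: float = 2.0) -> bool:
    """Crude reachability test: can a TCP connection be opened?"""
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:
        return False

class Dna:
    def __init__(self, name: str):
        self.name = name

    def run_test(self, target: str) -> bool:
        return reachable(target)

    def request_test(self, peer: "Dna", target: str) -> bool:
        # "Test please": ask a peer DNA to run the same test and report back.
        return peer.run_test(target)

def pinpoint(local: Dna, remote: Dna, target: str) -> str:
    mine = local.run_test(target)
    theirs = local.request_test(remote, target)
    if mine and theirs:
        return "target reachable from both vantage points"
    if not mine and theirs:
        return "problem is probably local or on my access path"
    if mine and not theirs:
        return "problem is probably near the peer"
    return "the target (or its network) appears to be down"
```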

(34)

The DNA Overlay Network

[Diagram: DNAs spread over the Internet form an overlay network.]

Use of a P2P-based overlay network
• DHT = Kademlia
• Peer = DNA

Challenges:
– keep the overlay connected
– locate a specific DNA
– locate a random DNA
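For readers unfamiliar with Kademlia, the sketch below shows the core idea used to locate a DNA in such an overlay: node IDs are compared with the XOR metric and a lookup proceeds towards the known peer closest to the target ID. The SHA-1-derived IDs, peer names and flat peer table are illustrative assumptions; the prototype’s actual DHT implementation is not shown.

```python
# Minimal sketch of Kademlia-style peer lookup via the XOR distance metric.
import hashlib

def node_id(name: str) -> int:
    """Derive a 160-bit Kademlia-style ID from a peer name (illustrative)."""
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")

def xor_distance(a: int, b: int) -> int:
    return a ^ b

def closest_peer(target: int, known_peers: dict) -> str:
    """Return the known peer whose ID is XOR-closest to the target ID."""
    return min(known_peers, key=lambda name: xor_distance(known_peers[name], target))

peers = {name: node_id(name) for name in ["dna-1", "dna-2", "dna-3", "dna-4"]}
print(closest_peer(node_id("dna-lookup-target"), peers))
```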

(35)

Scalability Results Using the DNA Prototype

[Plot: average search duration (axis 0–1200 ms) versus overlay size (100–500 peers); average online time = 60 min, no churn.]

(36)

Network Utility Function

[Diagram: a DNA at each end of an Internet path; U_In is evaluated at the sending side, U_Out at the receiving side, and the network in between contributes U_Netw.]

U_Out = U_Netw · U_In

• U_In evaluates the original quality
• U_Netw evaluates the quality of the network
• U_Out evaluates the received quality

(37)

Network Utility Function

• Range of U: 0 (worst) ... 100 % (best) – intuitive for
– Users
– Providers
– Operators
• Captures the performance-damping effect of the network
– U_Netw = 1 ⇒ network “transparent”
• Bad service perception (U_Out → 0) can have its roots in
– Badly performing network (U_Netw → 0)
– Badly performing application (U_In → 0)

U_Out = U_Netw · U_In
(38)

Throughput Utility Function

• Basis: Throughput
– on small time scales ΔT
– during an observation interval ΔW
• m-utility function U_m: captures the impact of changes in traffic volume
– Overdue traffic (→ late or lost)
• s-utility function U_s: captures the impact of changes in traffic burstiness
– Shaping = reduction (→ throttle)
– Sharing = increase (→ interfering traffic)
• n-utility function U_n:
– Bias by the network (e.g. UMTS vs. LAN)

U_Netw = U_m · U_s · U_n
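The sketch below only illustrates the idea: throughput is sampled per ΔT at both ends of the path during ΔW, a volume factor and a burstiness factor are derived and multiplied. The concrete formulas, the clamping and the omission of U_n are simplifications and assumptions of this sketch; the exact definitions of U_m, U_s and U_n are given in [5].

```python
# Heavily simplified illustration of the TUF idea; NOT the definitions from [5].
import statistics

def m_utility(sent_kbps, recv_kbps):
    """Volume factor: how much of the offered traffic arrives in time (clamped to 1)."""
    offered, delivered = statistics.mean(sent_kbps), statistics.mean(recv_kbps)
    return min(1.0, delivered / offered) if offered > 0 else 1.0

def s_utility(sent_kbps, recv_kbps):
    """Burstiness factor: penalise changes in throughput variation across the path."""
    s_in, s_out = statistics.pstdev(sent_kbps), statistics.pstdev(recv_kbps)
    return 1.0 / (1.0 + abs(s_out - s_in) / statistics.mean(sent_kbps))

sent = [60, 62, 61, 59, 60, 61]   # kbit/s per ΔT interval (invented sample values)
recv = [60, 40, 61, 30, 60, 58]
u_netw = m_utility(sent, recv) * s_utility(sent, recv)   # U_n omitted in this sketch
print(f"U_Netw ≈ {100 * u_netw:.0f} %")
```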

(39)

Recent Skype-via-UMTS results:

PESQ and NUF/TUF [6]

PESQ = Perceptual Evaluation of Speech Quality NUF = Network Utility Function

TUF = Throughput Utility Function


(40)

SNMP Interface

• Trap generation

– Upon threshold crossing, e.g.

• Green → Yellow
• Yellow → Red

• (Enterprise) MIB

– Not yet designed

– Cf. RMON history group

• Statistics (m, s)?

• Array with values?

• Histograms?

• Why not just participate in the overlay? ;)

Simple parameters for monitoring of Skype:
• U_Netw = U_m ≥ 80 % (green)
• U_Netw = U_m ≥ 50 % (yellow)
• U_Netw = U_m < 50 % (red)
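A minimal sketch of the traffic-lights evaluation and trap generation described on this slide, using the 80 % / 50 % thresholds above. The send_trap() stub, the state-change logic and the sample values are placeholders and assumptions; a real implementation would emit an SNMP trap towards the provider’s management station.

```python
# Sketch: classify U_m into traffic-light colours and "trap" on threshold crossings.
THRESHOLDS = [(0.80, "green"), (0.50, "yellow"), (0.0, "red")]

def traffic_light(u_m: float) -> str:
    for threshold, colour in THRESHOLDS:
        if u_m >= threshold:
            return colour
    return "red"

def send_trap(old: str, new: str, u_m: float) -> None:
    # Placeholder: in the prototype this would be an SNMP trap to the provider.
    print(f"TRAP: quality changed {old} -> {new} (U_m = {100 * u_m:.0f} %)")

previous = "green"
for u_m in [0.92, 0.81, 0.74, 0.46, 0.55]:   # invented sample values of U_m
    current = traffic_light(u_m)
    if current != previous:                  # trap only upon threshold crossing
        send_trap(previous, current, u_m)
    previous = current
```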

(41)

Thesis 3:

The user needs to be relieved from decisions based on incomplete feedback

(42)

Status

Internet usage still implies a high degree of self-service

• Some kind of Internet paradigm (just provide connectivity, the rest is left to the user)

• The “Anything-over IP-over-anything” principle provides both opportunities and nightmares

• Mastered differently by different applications (better by some, worse by others)

• A lot of “decision making” is left to the user – does (s)he really know about the implications?

• Recent trend towards IMS (IP Multimedia Subsystem): might help, but will the Internet community accept it?

(43)

Status

Issues:

• How do subjective QoE and objective QoS parameters match each other?

– More or less solved for some applications

• How can I be sure that

– “my” task is performed and completed

– “my” problems are detected and worked on in time?

• Which network can be used for a particular task?

– Rough indications available

• “Money back” policies?

– cf. airlines and (some) train companies

Solving these issues increases dependability perception and thus trust

(44)

Wish-list

• No additional complexity for the user!

– Application of self-organisation principles

• Preventive feedback:

– Clear guidelines and indications regarding (im-)possibilities

• Optional cross-layer interfaces required

• Reactive feedback:

– Signalling of success or failure

• Again a matter of cross-layer interfaces
– Action on behalf of the user

• Notifications

• Selections (e.g. a particular network)

(45)

Wish-list (continued)

• The Internet community should care about end user perception – tendencies visible:

– Next Generation Internet
– Internet2

– GENI initiative

• Performance researchers should care about the end user

– What is the use of your studies?

– How can you relate your results to user perception?

(46)

References

1. A. Bouch, A. Kuchinsky, and N. Bhatti. Quality is in the eye of the beholder: Meeting users' requirements for Internet quality of service. Technical Report HPL-2000-4, HP Laboratories Palo Alto, January 2000.

2. Nokia White Paper: Quality of Experience (QoE) of mobile services: Can it be measured and improved? http://www.nokia.com/NOKIA_COM_1/Operators/Downloads/Nokia_Services/whitepaper_qoe_net.pdf

3. D. Soldani, M. Li, and R. Cuny, eds. QoS and QoE Management in UMTS Cellular Systems. Wiley, 2006.

(47)

References

4. M. Fiedler, ed.: EuroNGI Deliverable D.WP.JRA.6.1.1. State-of-the-art with regards to user-perceived Quality of Service and quality feedback. May 2004. http://eurongi.enst.fr/archive/127/JRA611.pdf
5. M. Fiedler, ed.: EuroNGI Deliverable D.WP.JRA.6.1.3. Studies of quality feedback mechanisms within EuroNGI. May 2005. http://eurongi.enst.fr/archive/127/JRA613.pdf
6. T. Hoßfeld, A. Binzenhöfer, M. Fiedler, and K. Tutschku: Measurement and Analysis of Skype VoIP Traffic in 3G UMTS Systems. Proc. of IPS-MoMe 2006, Salzburg, Austria, Feb. 2006, pp. 52—61.

(48)

CfP

• WP.IA.8.6: First EuroNGI Workshop on Socio- Economic Impacts of NGI

• DTU, Lyngby (Copenhagen), Denmark, Oct. 9—10, 2006.

• http://eurongi06.com.dtu.dk/

• Still accepting contributions (extended abstracts)

(49)

Thank you for your interest ☺ Q & A

markus.fiedler@bth.se Skype: mfibth
