Many of your Android apps send unnecessary hidden data

Jon Fingas, Engadget, 11/23/15

It won’t shock you to hear that Android apps send a lot of data, but you may be surprised at how much of it isn’t really necessary… or public, for that matter. MIT researchers have determined that “much” of the hidden data sent and received by the 500 most popular Android apps isn’t necessary to the functionality. For example, a Walmart app talks to eBay whenever you scan a barcode — there’s no practical difference when you sever that connection. Out of the 47 apps that MIT modified to prove its case, 30 were virtually indistinguishable from the official versions. The rest only had minor issues, like missing ads.

This doesn’t mean that the data itself is suspicious, or that the issue is Android-specific. Half of it boils down to analytics data like crash and performance reports, which are present on iOS and other platforms. Some of it may simply help the app run more effectively, such as fetching content so that the app keeps working if you’re knocked offline. The concern is more that these apps don’t say what they’re doing with these communications. While the activity is likely innocuous, a less-than-careful app developer could put your info at risk without a good reason.

Posted on November 23, 2015 at 8:35 am by lesliemanzara
In: Android, iOS, Mobile Technology

Accelerated Mobile Pages is Google’s attempt to ‘unsuck’ the mobile web

DeviceAtlas, 11/16/15

Smartphones are the main way a growing number of users access websites, even though the mobile web can be extremely slow. That’s mainly because website content is often bloated with images, scripts and ad trackers that aren’t tailored for mobile devices. Google’s Accelerated Mobile Pages (AMP) initiative aims to tackle these issues by introducing a set of rules and restrictions for websites sent to mobile devices.

Mobile web needs improvements

The DeviceAtlas team has always been a keen advocate of improving web performance on all devices. In fact, there are plenty of ways web developers can optimize their pages for mobile devices, from building adaptive sites (as the largest players do) to adding server-side components to their responsive sites to make them lighter and faster. However, not all developers use these options, for a number of reasons, and this is what can make the mobile web suck.

Accelerated Mobile Pages are based on a new framework called AMP HTML, which aims to let developers build lightweight, rich websites that include text, images, videos and smart ads and that load instantaneously on mobile devices.

The initiative is basically yet another optimization technique, although it forces developers to follow a set of predetermined rules and restrictions. AMP decides what can and can’t be part of an approved page: for example, no third-party or developer-written JavaScript, and no external style sheets.

It sounds great from a user’s point of view. With AMP, users just get what they’re looking for when browsing on the go: the content.

You can test AMP sites by entering the following address in your mobile browser:

DeviceAtlas CTO on Accelerated Mobile Pages

But what does AMP mean for the world of web development? Here’s what our expert Ronan Cremin, the DeviceAtlas CTO, had to say about Google’s latest initiative:

First off, it clearly helps to speed up load times. The AMP version of The Verge loads about 10x faster for me than the normal version.

Secondly, it’s clearly better than the Facebook Instant Articles initiative because it’s not tied to Facebook. Also, publishers can choose to avail of Google’s free caching, and anyone else can implement a caching system if they want.

Many people have commented that all this amounts to a set of limitations and that you don’t need AMP to achieve the same results. That’s correct, though AMP also uses some tricks of its own to render quickly.

Lots of people have also pointed out that the limitations it imposes are quite draconian (e.g. no JS at all), and they make the case that JS isn’t necessarily bad, which is true: you can design rich pages that load quickly.

But I do think it has value:

  • It seems that lots of web developers (perhaps less experienced ones) actually don’t realise what you need to do to achieve high performance, because they got started when jQuery etc. was already standard.
  • AMP gives you an easily testable package of recommendations (add “#development=1” to any AMP page).
  • It allows sites to make a (testable) promise: this page will work well.
  • Even if people choose not to use it, it may be useful in helping to steer people the right way, i.e. to think twice about their development practices in light of what Google is recommending. People pay attention to what Google says.

As a user, if I’m presented with an AMP link and a non-AMP link I’ll pick the AMP one without hesitation. The load time issue really is huge. If I remember that a site has opted to implement AMP I’ll be far more likely to visit again. Any time I see a link I think twice before I click on it.

The troubling aspects of AMP pages are:

  • It requires JavaScript, and the pages won’t load without it. This means smartphone only (and they’re open about this). Given that the JS is centrally hosted, it will get cached quickly, so the load time of the JS itself doesn’t bother me.
  • Perhaps this is Google building a ‘walled garden’, e.g. in the future they may prioritize AMP pages over non-AMP pages in the search results, even if the non-AMP versions work just as well. I would like to believe that this won’t be the case because, ultimately, if Google is not delivering the best results to users it will fail.
  • It doesn’t do anything to decrease image weight, which accounts for roughly 70% of the size of the average page. It doesn’t preclude solutions either. It just feels like this was a lost opportunity.
  • It feels over restrictive at a time when the web is richer than ever. It’s about static pages, a web of documents rather than apps.

AMP indicates that the mobile web has a performance problem

It’s not really surprising that Google’s Accelerated Mobile Pages cause controversy, especially among web developers and website owners who are wary of being forced into a single solution built and chosen by someone else, particularly when the company behind AMP is the ‘big G’, which also happens to own the search engine that is, for many sites, the largest source of traffic.

This doesn’t change the fact that the mobile web needs improvements and Google’s Accelerated Mobile Pages is a step in the right direction. The initiative clearly indicates that web performance issues need to be addressed in the near future.

Posted on November 16, 2015 at 12:48 pm by lesliemanzara
In: Mobile Technology

Wi-Fi Alliance presents coexistence guidelines; LTE-U Forum revises specs to ensure LTE-U coexistence with Wi-Fi

FierceWireless, 11/9/15

Verizon, T-Mobile, Google attend workshop to spark dialog on fair sharing

While the Wi-Fi Alliance conducted its first workshop aimed at harmonizing Wi-Fi and LTE-U yesterday, the LTE-U Forum, based on feedback from the Wi-Fi community, has revised its specifications for LTE-U to be “doubly sure” that LTE-U coexists with Wi-Fi and to provide a “smooth upgrade path” from LTE-U to LAA.

“These changes will also improve LTE-U performance,” Qualcomm said in a statement. Qualcomm has been one of the biggest proponents of LTE-U, along with operators Verizon and T-Mobile US.

The Wi-Fi Alliance this week presented its coexistence guidelines for LTE in unlicensed spectrum in what it described as an attempt to provide a common basis on which future Wi-Fi/LTE coexistence studies are conducted and to offer a foundation on which to build an unlicensed LTE test plan. Qualcomm and other members of the LTE-U Forum were invited to attend the full-day workshop in Palo Alto, Calif., yesterday.

Qualcomm told FierceWirelessTech that it was pleased to participate in the workshop and looks forward to continuing collaboration with the Wi-Fi Alliance. “For the past two years, as Qualcomm and its partners developed LTE-U, we have had an ongoing dialogue with the Wi-Fi community to ensure that LTE-U does not cause any adverse impact on Wi-Fi,” the company said. “We have a strong vested interest in ensuring that LTE-U coexists successfully with Wi-Fi in view of our own Wi-Fi business.”

The Wi-Fi Alliance said its guidelines consider myriad use cases where Wi-Fi is used today, including video streaming, VoIP and dense and dynamic Wi-Fi environments. The guidelines focus on key performance indicators that matter for coexistence, expected real-world topologies of mixed Wi-Fi and LTE-U environments, and what the group calls realistic network loading scenarios.

Edgar Figueroa, president and CEO of the Wi-Fi Alliance, told FierceWirelessTech last month that the guidelines and testing may evolve based on industry input, and he reiterated that again this week. “Cooperation among a broad cross-section of industry provides the best opportunity to deliver a viable solution for fair coexistence,” he said in a press release.

The guidelines mention two categories of Wi-Fi coexistence studies: Wi-Fi Baseline, where the performance of one or more Wi-Fi networks operating on the same channel is measured; and Wi-Fi and LTE Coexistence, where one or more Wi-Fi access points is replaced by an LTE device so that the impact of LTE on Wi-Fi performance can be assessed.
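To make that comparison concrete, here is a minimal sketch of the logic implied by those two categories: measure a Wi-Fi network’s throughput when it shares a channel with another Wi-Fi access point (the baseline), then again when that access point is replaced by an LTE-U node, and compare the two. The throughput numbers and the pass threshold below are illustrative assumptions, not figures from the Wi-Fi Alliance’s actual test plan.

    # Sketch of the comparison implied by the two study categories. The numbers
    # and the pass threshold are illustrative assumptions, not the Wi-Fi
    # Alliance's actual test plan.

    def coexistence_report(baseline_mbps: float, with_lte_u_mbps: float,
                           tolerance: float = 0.05) -> str:
        """Compare a Wi-Fi network's throughput when it shares the channel with
        another Wi-Fi AP (baseline) versus with an LTE-U node."""
        loss = (baseline_mbps - with_lte_u_mbps) / baseline_mbps
        verdict = "acceptable" if loss <= tolerance else "harmful"
        return f"throughput loss vs. Wi-Fi baseline: {loss:.1%} -> {verdict}"

    if __name__ == "__main__":
        print(coexistence_report(baseline_mbps=42.0, with_lte_u_mbps=40.5))
        print(coexistence_report(baseline_mbps=42.0, with_lte_u_mbps=28.0))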

One of the objectives of the workshop was to bring all the stakeholders together and talk about how to resolve the concerns that people in the Wi-Fi industry have about LTE adversely impacting Wi-Fi users if introduced in unlicensed bands. Verizon, AT&T, T-Mobile, as well as Amazon, Broadcom, Dell, Ericsson, Google, Intel, NTT and Gibson Brands were invited to participate in the workshop.

“The Wi-Fi Alliance’s efforts are a useful and important step to protect consumers’ interests — ensuring that Wi-Fi thrives while supporting innovation in unlicensed spectrum — and we’re heartened so many companies are participating,” said Bill Maguire, director of WifiForward’s Save our Wi-Fi Campaign, in a statement. “Wi-Fi and the other unlicensed technologies we depend on today did not happen by magic; they are the result of a collaborative multi-stakeholder, standards-setting process driven by the private sector, academics and independent engineers. The test plan discussed today shows that the process to develop LTE-U so far has lacked important aspects. We hope these guidelines will help all stakeholders to move back to the standards-setting process for the development of LTE-U.”

Indeed, unlike Licensed Assisted Access (LAA), LTE-U was developed outside the standards bodies via the LTE-U Forum with Verizon, Qualcomm, Alcatel-Lucent, Ericsson and Samsung as initial members. T-Mobile later threw its support toward the forum’s efforts, and last month a coalition that includes many of the same in addition to AT&T, CTIA and the Competitive Carriers Association (CCA) launched Evolve, a coalition designed to promote the benefits of unlicensed spectrum and new technologies like LTE-U and LAA.

CTIA has not had an opportunity to review the guidelines, but “we share the goal of demonstrating that consumers will benefit from LTE in unlicensed bands without harming unlicensed users,” said Tom Sawanobori, SVP and CTO at CTIA, in a statement. “Any guidelines should reflect real-world use cases and recognize the extensive testing that has already been done that repeatedly shows Wi-Fi and LTE in the unlicensed bands coexist well together. As we have said all along, Wi-Fi remains an important offloading capability that wireless carriers need to meet Americans’ mobile demands, and adding LTE in the unlicensed bands will only enhance user experience.”

In August, Verizon, T-Mobile and Qualcomm were among a group expressing concerns that the Wi-Fi Alliance was trying to act like a “gatekeeper” by insisting that the FCC hold off certifying LTE-U equipment until the Wi-Fi Alliance could develop a coexistence test plan and complete its own coexistence evaluation program of LTE-U’s impact on Wi-Fi. The LTE-U Forum released etiquette protocols and coexistence testing requirements in early March and has said it is working to educate the unlicensed community about the technology, providing detailed technical specifications and extensive coexistence test results.

Before the workshop, the Wi-Fi Alliance sponsored research to gauge consumer attitudes about Wi-Fi. Not surprisingly, a large majority of employed adults said not having Wi-Fi access would impact their work productivity, with the largest impact taking place at home. Seventy-three percent of respondents said it’s very/somewhat important to always have access to Wi-Fi in their daily life, while millennials (ages 18-24) are more likely to indicate this than any other group.

Kevin Robinson, vice president of marketing at the Wi-Fi Alliance, noted that a significant percentage of respondents 65 years and older shared that sentiment, indicating that Wi-Fi is a technology that spans generations. What was surprising is that, according to the survey, which was conducted online in the United States by Harris Poll, 23 percent of Americans have heard of the term LTE-U. Robinson said some people probably keyed in on the “LTE” part of the question rather than on the LTE-U-in-unlicensed-spectrum technology that many in the industry are concerned about.

Both the Wi-Fi Alliance and the LTE-U Forum members have said they want to avoid regulatory intervention. The FCC in May took the somewhat unusual step of releasing a public notice seeking comment on LTE-U and LAA after it heard concerns about potential interference in unlicensed bands. Last month, some companies continued to urge the FCC to scrutinize LTE-U proponents.

This week’s workshop was the only one on the calendar but more are expected to follow, likely after the holidays.

Posted on November 9, 2015 at 12:59 pm by lesliemanzara
In: Mobile Technology

Even Samsung’s tiny Tizen OS is now bigger than BlackBerry


Vlad Savov, The Verge, 11/9/15

In a mobile market dominated by Apple’s iPhone and Google’s stable of Android devices, there’s very little room for any more competitors. Microsoft can only muster a couple of percentage points as the distant third-place contender, and everyone else’s share is measured in mere fractions. As bad as that was, at least BlackBerry could count itself fourth and see its name on Gartner and IDC’s smartphone market share reports.

According to Strategy Analytics today, however, BlackBerry has lost even the fourth spot on the market, having been superseded by Samsung’s Tizen smartphones. Tizen OS is Samsung’s initiative to construct an operating system of its own, and it’s being used on this year’s range of Samsung smart TVs, the Gear S2 smartwatch, and a couple of entry-level smartphone models. Those Tizen phones, in all their Android-imitating glory, have apparently proven sufficiently alluring to generate greater sales than BlackBerry is achieving with all of its devices.

From here on, it’s Android or bust

The report from Strategy Analytics indicates what we all might have guessed: Microsoft, BlackBerry, and Firefox have all “drifted down” in their market share, while demand for new iPhones has driven Apple up. There’s little question, in light of BlackBerry’s continuing decline, that the Canadian company had to switch to Android if it was to stand any chance of remaining a player in the mobile market. The Priv smartphone marks that switch, and much rides on its success. Though even if BlackBerry survives as a mobile manufacturer over the long term, its operating system is probably already done. There’ll be extended support for all those enterprise clients BlackBerry already has, but the future for this company looks like it’s going to be Android or bust.


Posted on November 9, 2015 at 12:50 pm by lesliemanzara
In: Android, Blackberry, iOS, Mobile Technology, WinPhone

Trojan adware on Android can give itself root access

Android has taken quite a beating lately on the security front. In part, that’s due to a perfect storm consisting of a large user base, slow software and security updates, and the ability to sideload untrusted apps that could potentially steal data or crash the phone. Against that backdrop, antivirus maker Lookout has posted a warning about a wave of novel malware with a nasty side effect: giving itself root privileges.

The attack vector is pretty straightforward. An attacker downloads a popular app like Facebook or WhatsApp from the Google Play store and then injects it with one or more root exploits. The trojan app then gets uploaded to a third-party repository. When a user downloads and installs the infected app, the root exploit payload works in the background, attempting to gain root access on the infected device.
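One practical consequence of that repackaging is that the trojanized copy can no longer carry the original developer’s signature: the attacker has to re-sign the modified APK with their own key. The sketch below, offered only as an illustration, checks an APK’s signing-certificate fingerprint against a known-good value. It assumes the third-party Python “cryptography” package, and the expected fingerprint and file name are placeholders rather than real values.

    # Sketch: a repackaged APK must be re-signed, so its certificate fingerprint
    # won't match the official developer's. Assumes the third-party
    # "cryptography" package; EXPECTED_SHA256 is a placeholder value.
    import zipfile
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.serialization import pkcs7

    EXPECTED_SHA256 = bytes.fromhex("00" * 32)  # placeholder for the official cert

    def signer_fingerprints(apk_path: str) -> list:
        """Collect SHA-256 fingerprints of the certificates that signed the APK."""
        fps = []
        with zipfile.ZipFile(apk_path) as apk:
            for name in apk.namelist():
                if name.startswith("META-INF/") and name.endswith((".RSA", ".DSA", ".EC")):
                    certs = pkcs7.load_der_pkcs7_certificates(apk.read(name))
                    fps.extend(c.fingerprint(hashes.SHA256()) for c in certs)
        return fps

    if __name__ == "__main__":
        fps = signer_fingerprints("suspect.apk")
        print("official signer" if EXPECTED_SHA256 in fps else "NOT the official signer")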

So far, Lookout has found three distinct forms of this kind of attack, named ShiftyBug, Shuanet, and Shedun (also known as GhostPush). The root exploits they use are often the same ones found in popular root-enabling software packages, like ExynosAbuse and Framaroot.

Lookout says users may never know the cause of any issues they might have after their devices are infected, because the infected app seems to work correctly most of the time. Even worse, in some cases, the app can write itself to protected system storage, meaning that not even wiping the phone’s user-accessible storage can remove the payload. That means infected phones could potentially have to be replaced entirely.

The best way to stay safe seems to be sticking to official distribution channels. While official app stores can have their own security problems, they’re no doubt safer than using third-party sites relying on user—rather than developer—submissions.

Posted on November 9, 2015 at 12:46 pm by lesliemanzara
In: Android, Mobile Technology

HTML5 On The Rise: No Longer Ahead Of Its Time

TechCrunch, 11/3/15

Today marks the one-year anniversary of the World Wide Web Consortium (W3C) declaring the HTML5 standard complete, a significant milestone in the history of the Internet and web application development. In this past year, HTML5 adoption has gone into overdrive, with more and more companies moving to HTML5 to deliver rich cross-platform web applications. The most recent examples include Amazon, Facebook, Google and YouTube, which transitioned from Adobe’s Flash to HTML5. Why is that? And why now?

For years, millions of developers have standardized their application development on HTML5 to meet business demand for a seamless and superior user experience across all devices and screens. But in 2015, HTML5 has really emerged as a reliable and universal choice for building enterprise-class software that companies can use to deliver rich web experiences as they continue to move toward a mobile-first strategy.

The power of HTML5 has been clear for a long time. In 2012, industry influencer Mark Zuckerberg gave it flak, only to change his story three years later, adopting HTML5 technology to beef up Facebook’s news feed (see our demo in this side-by-side comparison).

There are three concurrent industry trends driving this shift toward web application adoption and HTML5 development…

Technical Innovation Is, Finally, Catching Up

The transition to HTML5 is powered not only by continuous performance and feature innovation in the language itself, but also by improvements in the modern browsers in which web apps run. Browsers are now significantly faster than they were just a few years ago. In a market once dominated by Microsoft Internet Explorer, browsers such as Google Chrome are forcing the field to develop faster and more efficient solutions. With the fading of Flash, HTML5’s rich multimedia capabilities are capturing developers’ attention, as they can perform tasks within the browser’s basic functionality, features that in the past required users to download and install plug-ins.

Another technical innovation driving the adoption of HTML5 is the processing performance of mobile devices, such as Samsung’s smartphones with their impressive 8-core processors.

These technical advancements are harnessing the power and ubiquity of HTML5, making it the emerging standard in the enterprise.

Businesses Crave It

Organizations are under immense pressure to deliver highly sophisticated web and mobile applications to their customers. At the same time, customers expect to access these applications on a wide range of devices, including desktops, tablets and smartphones. Not only are customer expectations rapidly increasing, but so is the rate of change. To keep pace with industry demands, enterprises are investing in technologies that help them meet their customers’ cross-platform web and mobile application needs, both now and in the future.

With its write-once, deploy-anywhere capabilities, HTML5 empowers companies to design, build and manage apps with greater sophistication and complexity across multiple platforms and devices in the same amount of time.

Developers Won’t Live Without It

Even as the digital environment gains complexity and sophistication, development teams remain under constant pressure to deliver complex apps, faster. That’s why they are opting for HTML5, as illustrated by a recent Strategy Analytics survey on mobile application developers’ preferences and attitudes toward app development. Researchers found that out of all the technologies for building native or web apps, HTML5 showed the strongest predicted growth at 20 percent, with 63 percent of all business apps being created in HTML5.

For developers, one key attraction to HTML5 is its open standards support, which helps them deliver on application requirements in the face of fragmented mobile devices, form factors, platforms and operating systems. Developers can use HTML5 to create and present rich content without relying on the device or its operating system, making it a preferred alternative to native.

If given the choice, developers have always preferred coding in a language that translates across platforms — both to ensure a quality user experience on multiple screen sizes and to maintain a skill set that’s applicable for different employers and development requirements. Web app development will continue to accelerate as JavaScript programmers move to HTML5.

Looking Ahead: HTML5 In 2016 And Beyond

As someone who has spent years on the front line with development teams, I am thoroughly impressed by HTML5 and the revolutionizing force it has had on mobile app development. In this perfect intersection of technical innovation, developer preference and enterprise need, I’m both hopeful and excited about what HTML5 will enable in the years ahead.

In the next year, I believe that adoption of HTML5 will grow as enterprises begin to move beyond their legacy mandates to use Internet Explorer, allowing employees to also use Chrome or Firefox browsers at work, both of which have superior HTML5 support. And down the road, as adoption of Windows 10 grows, the new Microsoft Edge browser will enable businesses to take full advantage of the power of HTML5.

Posted on November 3, 2015 at 4:11 pm by lesliemanzara
In: Mobile Technology

LTE-U: A quick explainer

ComputerWorld, 11/3/15

LTE-U is a wireless network technology that’s promising a lot, as well as ruffling a few feathers (especially in the Wi-Fi world). Here’s a brief rundown for the perplexed.

OK, so what’s LTE-U?

LTE-U is a system of wireless communication designed to use unlicensed spectrum – which is open to everybody, within certain limits – to ease the burden on big mobile carriers’ networks. Regular LTE is the system they use to transmit and receive information across their licensed spectrum – to which only they have access. LTE-U (short for Long-Term Evolution in unlicensed spectrum) uses the same “language” to operate on the unlicensed spectrum, which the carriers don’t have to spend billions of dollars to acquire.


Oh, goodness, yes. The FCC has been busy auctioning off the rights to various parts of the spectrum lately – companies bid for the rights to such-and-such a frequency in specific geographic locations in the U.S. – and the last auction took in almost $45 billion.

Cripes, that’s real money, even for Verizon and AT&T.

Sure is. And the reason they’re willing to spend it is that their networks are creaking under the truly crazy demand for data that they’re facing – all that Netflix and YouTube and Twitch and even the stuff that isn’t video (although video is the biggest issue by a long shot) is creating serious capacity problems for the big carriers.

That’s why they’re doing everything they can to stay ahead of it – building new infrastructure, acquiring new spectrum and trying to impose data caps without looking like they’re imposing data caps. LTE-U is part of that, since it would let them offload some of the spiraling demand onto the unlicensed band.

Swell. So what’s the big deal?

Well, there’s this thing called Wi-Fi that operates on the unlicensed band in the 2.4GHz and 5GHz ranges, which is exactly the piece of spectrum that LTE-U wants to use. Two radio waves in the same physical location at the same frequency mean interference, which means crappy service and “ugh-why-doesn’t-this-stupid-thing-work?”

Oh, well, that’s going to annoy just about everybody, huh?

Yep. Qualcomm – which invented LTE-U – swears up and down that they’re incorporating coexistence features that will prevent it from harming existing Wi-Fi installations, and to be fair, it seems highly unlikely that they’re just planning to throw LTE-U out there, your home Wi-Fi be damned. The problem is, though, that we don’t really have any way of knowing that for sure, nor any guarantees that the system will operate the way it ought to.

How come?

Qualcomm didn’t present LTE-U to either of the big wireless industry standards bodies – 3GPP or IEEE – for formal testing and approval, even though they’ve been relatively up-front about what the technology is going to entail. The idea is to use a system called CSAT (Carrier Sense Adaptive Transmission, before you ask) to make LTE-U stations pause their transmissions for tiny periods to allow Wi-Fi to make use of the same frequencies. The principle is called duty cycling.
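As a rough illustration of what duty cycling means in practice, here’s a minimal sketch: the LTE-U station transmits for a fixed burst, pauses for a fixed gap, and the size of that gap is all the airtime Wi-Fi gets. The on/off durations below are made-up numbers, not anything Qualcomm has specified.

    # Illustrative duty-cycle sketch; not Qualcomm's actual CSAT algorithm.
    # The on/off durations are made-up numbers for demonstration only.

    def wifi_airtime_share(lte_on_ms: float, lte_off_ms: float) -> float:
        """Fraction of airtime left for Wi-Fi when LTE-U alternates between
        transmitting (on) and pausing (off) on the shared channel."""
        cycle_ms = lte_on_ms + lte_off_ms
        return lte_off_ms / cycle_ms

    if __name__ == "__main__":
        # A carrier-friendly schedule: long bursts, short pauses.
        print(wifi_airtime_share(lte_on_ms=18, lte_off_ms=2))   # 0.1
        # A Wi-Fi-friendly schedule: the carrier yields most of the cycle.
        print(wifi_airtime_share(lte_on_ms=5, lte_off_ms=15))   # 0.75

Note that both parameters are chosen by the LTE-U operator, which is exactly the point raised in the next question.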

That sounds fair enough – but people still have a problem?

Yeah – the thing about duty cycling is that the carriers are the ones in charge of scheduling those pauses, and they’re under no real obligation to provide a decent window of time for Wi-Fi to coexist. Remember, it’s unlicensed spectrum! But since Wi-Fi is what’s known as a polite protocol, it will politely stop talking while LTE-U is transmitting – and even if it didn’t, all that would happen is the signals crashing into each other and getting garbled. Basically, LTE-U’s coexistence mechanisms aren’t very convincing to some people, and there’s no standards group that has the authority to force it to play nice.

Well, that’s no good – what are the alternatives?

There’s a technology called Licensed Assisted Access or LAA that does roughly the same thing as LTE-U, but folds in a standard called “listen before talk” (LBT), which does pretty much what it sounds like. (Wi-Fi does this.) It’s not a perfect solution to the main coexistence problem, but LTE-U critics say it’s considerably more even-handed than Qualcomm’s plan. LBT is actually a legal requirement in the EU and Japan, so LAA is the only game in those particular towns.
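For contrast, a toy listen-before-talk loop might look like the sketch below: sense the channel before every transmission, and back off for a random interval whenever it’s busy. The channel_is_busy() function is just a stand-in for real energy detection on the radio, and the timings are arbitrary.

    import random
    import time

    def channel_is_busy() -> bool:
        # Stand-in for real energy detection; pretend Wi-Fi holds the
        # channel about 40% of the time.
        return random.random() < 0.4

    def transmit_with_lbt(frame: str, max_backoff_ms: float = 8.0) -> None:
        """Listen before talk: sense the channel, back off while it is busy,
        and transmit only once it is clear."""
        while channel_is_busy():
            time.sleep(random.uniform(1.0, max_backoff_ms) / 1000.0)
        print(f"channel clear, sending {frame}")

    if __name__ == "__main__":
        for i in range(3):
            transmit_with_lbt(f"frame-{i}")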

Huh. So why not just use that instead?

Because LAA is a 3GPP standard, and as such is going through a lengthy process of testing and approval – which means that, in places that don’t legally mandate the use of LBT, including the U.S. and China, companies could rush LTE-U to market quicker and help take the pressure off their networks.

If the carriers have this huge demand problem, wouldn’t they want to make it easier for people to use Wi-Fi instead?

Certainly, and LTE-U’s backers have been making this very point at great volume as evidence that LTE-U won’t pose a coexistence problem. Realistically, it doesn’t seem likely that any version of LTE-U that the carriers would release would cause Wi-Fi Armageddon, and the problem seems more likely to be a matter of degrees – if LTE-U helps ease a carrier’s network load, even if it has minor deleterious effects on Wi-Fi networks in an area, they can probably live with that, given that there aren’t any real consequences for them.


To be fair, there’s no need to freak out just yet – Verizon and T-Mobile, the strongest advocates for LTE-U, have said that they’re not planning to roll the technology out until next year, and a lot can happen between now and then. Discussions among industry players are continuing, and FCC chairman Tom Wheeler has hinted that the agency could get involved if the companies can’t come up with a more convincing solution to the coexistence problem.

Posted on November 3, 2015 at 4:07 pm by lesliemanzara
In: Mobile Technology

Why 5G will be the superfast Swiss Army knife of wireless networks

CNet News, 11/3/15

Hans Vestberg, CEO of telecom equipment vendor Ericsson, sees the next decade’s 5G networks being smart enough to know what kind of device is using them, and why.

When it comes to the wireless networks of the future, speed won’t be everything.

The advent of so-called 5G, or fifth-generation, wireless technology will bring incredible speed, for sure, with the industry aiming to see your network connection jump by 100 times. (Yes, 100.) More importantly, the network will be smart enough to act differently depending on how it’s accessed, whether from a heart monitor when you’re relaxing at home or from a self-driving car zipping down a crowded highway.

That’s according to Hans Vestberg, CEO of Ericsson, one of the world’s largest suppliers of telecommunications equipment.

His comments provide a glimpse into what tomorrow’s wireless network will look like. While carriers around the world are still deploying 4G networks, which have brought broadband speeds over the air, there’s increasing chatter about what’s next. In the US, Verizon Wireless has already said it plans to field-test its own take on 5G next year, and the industry is starting to talk about the new kinds of devices and connected services that will spring from the technology.

“Many industries will look at how 5G will transform their business,” Vestberg said in an interview on Friday. “It’s my job to build a network to handle that.”

As fast as Verizon is moving, the industry isn’t expected to invest in the technology in earnest until 2020. The speed and capacity 5G brings could offer a legitimate alternative to the physical connection available via Internet service providers and companies such as Google, which use fiber optics to deliver super-high speeds. 5G is supposed to be even faster.

Depending on the device, 5G may have a range of behaviors, he said. The network has to be responsive enough to tell a self-driving car where to go and how to react to situations that require a split-second reaction. It has to be consistent enough to maintain a connection with a hypothetical chipset in your body that can monitor your vitals, but know to instantly ping emergency services in case something goes wrong. It also has to operate efficiently enough that farms can use sensors that can ping the network for 10 years on a single charge.

In other cases, 5G wireless technology may replace the broadband service coming into your home via wires or cables, Vestberg said. It’s already happening with 4G in some parts of the world, but 5G adds higher speed and capacity. With 5G, carriers could also deliver super-sharp 4K video to the home.

One of the reasons Verizon is holding its test so early is to figure out what kinds of applications can take advantage of 5G, the New York-based telecommunications company said in September.

While 5G may bring many things, it’s unlikely that unlimited data will be one of them. In the US, Verizon and AT&T have already eliminated their options for an all-you-can-eat data offering, while T-Mobile and Sprint have limits in place for excessive users. The curtailing of unlimited plans has more to do with economic issues than technical ones.

Vestberg declined to comment on the plans of his carrier customers, but noted that there was a cost to building out these networks, with players such as AT&T projected to spend roughly $10 billion this year. He also warned that as capacity and speeds have increased, so too has usage.

Ericsson could use the boost in equipment sales. The Swedish company posted a third-quarter profit that fell below expectations as carriers around the world slowed down their network deployments.

Posted on November 3, 2015 at 4:00 pm by lesliemanzara
In: Mobile Technology

The Android smartphone that promises you’ll never run out of storage is available to pre-order

Ben Woods, TNW, 10/22/15

Deleting photos, apps and other random downloads you’d forgotten about might be par for the course for smartphone owners (pretty much everyone, then), but Nextbit’s cloud-first ‘Robin’ device is now available to pre-order and promises that deleting your data to free up space is a thing of the past.

The device had previously been available via Kickstarter for $299 (for anyone quick enough to snap up an Early Bird deal) but now costs $399 to pre-order, with delivery pencilled in to begin in February next year, so don’t go expecting one in time for Christmas.

The handset ships with a modified version of Android, 32GB of internal storage and an additional 100GB of space in the company’s cloud, but the smart bit is the way Nextbit says it will manage that space: by automatically moving rarely-used files from internal to cloud storage. The company also says that its ‘cloud-first’ approach to mobile is starting with storage, and that other services will be added over time.
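Nextbit hasn’t published how that offload decision is made, but a minimal sketch of one plausible policy, evicting the least recently accessed files once local space runs low, looks something like this. The upload_to_cloud() function and the paths are placeholders, not Nextbit’s actual implementation.

    import os

    def upload_to_cloud(path: str) -> None:
        # Placeholder: a real implementation would upload the file and then
        # replace it locally with a lightweight stub or thumbnail.
        print(f"would offload: {path}")

    def offload_rarely_used(root: str, bytes_to_free: int) -> None:
        """Offload the least recently accessed files under root until roughly
        bytes_to_free of local space has been reclaimed."""
        files = []
        for dirpath, _, names in os.walk(root):
            for name in names:
                full = os.path.join(dirpath, name)
                st = os.stat(full)
                files.append((st.st_atime, st.st_size, full))
        files.sort()                       # oldest access time first
        freed = 0
        for _, size, path in files:
            if freed >= bytes_to_free:
                break
            upload_to_cloud(path)
            freed += size

    if __name__ == "__main__":
        offload_rarely_used("/sdcard/DCIM", bytes_to_free=500 * 1024 * 1024)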

The toughest part for the company will be convincing fickle smartphone buyers to take a punt on a new brand in a market that contains rivals like the HTC One A9 and Nexus 5X, but with nearly $1.4 million raised on Kickstarter and another $100,000 from its pre-order sale on Backerkit, there’s clearly some demand for the device.

Posted on October 22, 2015 at 7:56 am by lesliemanzara
In: Android, Mobile Technology

4G Flaw Affects All Android Phones on AT&T, Verizon

Tom’s Guide / Marshall Honorof, Yahoo!News, 10/22/15

The vast majority of security flaws can only affect you if your software is out-of-date, or if you neglect to install a security suite. Once in a while, though, one crops up that blasts users across the board.

Such is the case with a newly discovered cellphone flaw. The Computer Emergency Response Team (CERT) at Carnegie Mellon University in Pittsburgh, sponsored by the Department of Homeland Security, is warning that all Android phones on AT&T and Verizon Wireless are currently open to attack over the Long Term Evolution (LTE) 4G network, and for the moment, there’s not much the average consumer can do about it.

Information about the LTE flaw initially came from a team of South Korean security researchers. The researchers’ technical paper, entitled “Breaking and Fixing VoLTE: Exploiting Hidden Data Channels and Mis-implementations,” first saw the light of day at the 22nd ACM SIGSAC conference in Denver earlier this month. The takeaway is that calls made over an LTE network could theoretically make a phone susceptible to data theft, phone spoofing and unauthorized calls.

If you’re interested in exactly how the process works (and have the technical chops to parse it), the research paper explains in great detail how LTE networks can use a new telephony protocol called voice-over-LTE (VoLTE). As 4G/LTE networks become more common, more and more phone calls use VoLTE rather than traditional telephone-network protocols.

Essentially, pre-LTE cellphone calls worked much the same as their land-line counterparts have since the 19th century. Two parties would connect directly to each other using a dedicated temporary connection, or circuit, provided by the telephone network. Signals, whether analog or digital, would travel directly between them along that single connection without interference from a third party. Some LTE voice networks still use this model.

VoLTE is very different and uses packet switching, which transmits small bits (or “packets”) of data across a large network made up of a theoretically infinite number of connections — i.e., the Internet. Each data packet “knows” where to go, and the packets are reassembled into a data stream — in this case, sound — at the destination. Almost every piece of data transmitted across the Internet follows such protocols.
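As a toy illustration of that idea (and nothing like the real VoLTE/RTP stack), the sketch below chops a byte stream into sequence-numbered packets, shuffles them to mimic packets taking different routes, and reassembles them at the destination.

    import random

    def packetize(stream: bytes, size: int = 4) -> list:
        """Split a byte stream into (sequence_number, payload) packets."""
        count = (len(stream) + size - 1) // size
        return [(i, stream[i * size:(i + 1) * size]) for i in range(count)]

    def reassemble(packets: list) -> bytes:
        """At the destination, sort by sequence number and rejoin the payloads."""
        return b"".join(payload for _, payload in sorted(packets))

    if __name__ == "__main__":
        voice = b"hello over VoLTE"
        packets = packetize(voice)
        random.shuffle(packets)            # packets may arrive out of order
        assert reassemble(packets) == voice
        print(reassemble(packets))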

However, moving to packet-based switching opens up voice calls to a huge array of Internet-based attacks that cellular carriers, accustomed to the built-in insularity of circuit-based switching, might not have anticipated. As VoLTE packets travel over the Internet, third parties can access these packets by using sophisticated techniques described in the research paper.

To put it simply, a technically-minded cybercriminal could override call permissions, horn in on private calls, steal a phone number for his or her own purposes or even hack into a user’s phone directly. From there, installing a malicious Android app on a targeted phone would be trivial, further opening up the phone for text-message scams, phishing or whatever else could turn a profit.

There is some good news, however: The issue appears to be exclusive to Android phones on the AT&T and Verizon networks. T-Mobile users (and, by inference, MetroPCS users as well) were affected when the paper was written, but T-Mobile told ZDNet that the issue had been “resolved.” (Sprint has not yet launched VoLTE service.) Apple devices on any network are unaffected.

This patchwork of vulnerable and immune systems suggests that both Google and the wireless carriers can patch the issue — and should probably do so sooner rather than later.

It’s worth noting that there’s currently no reason to think that attackers are using these techniques in the wild, although the paper may inspire some to try. At present, Carnegie Mellon CERT is “unaware of a practical solution to these problems.”

Tom’s Guide has one suggested solution, although it’s not ideal and will not work on all Android phones. Go into Settings, select Cellular Networks or Mobile Networks, then Preferred Network Type. If there’s an opportunity to switch from LTE to 3G, CDMA or GSM, do so. (Not every phone has this option.)

3G networks are apparently unaffected by the LTE issue, although they’re not ideal for processing modern sites and apps; phones have come a long way in the past few years.

Posted on October 22, 2015 at 7:52 am by lesliemanzara
In: Android, Mobile Technology