Thursday, February 18, 2010

Watching the creative destruction of the mobile industry at MWC

The mobile device and infrastructure industries continued their familiar yet increasingly complex dance at this week's Mobile World Congress in Barcelona: Consumers and enterprises receive ever more devices to choose from, while carriers scramble to figure out how to support, deploy and make money off the mix.

Check out our slideshow of innovations unveiled at Mobile World Congress 2010. 

New devices with yet more operating systems are aimed at creating a class of inexpensive smartphones designed for the still-vast audiences not yet using a BlackBerry, an iPhone, or an Android-based handset.

At the intersection of soaring mobile Internet traffic, enabled by ever more sophisticated client devices, and of rising infrastructure investments, in both enhanced 3G and powerful 4G networks, is an industry-wide experiment to create new services along with new revenue models.

"Right now, the service landscape [from which] mobile operators actually are gaining revenue in mobile data is very thin, with most of the revenues coming from data service subscriptions that are flat fees," says Bettina Tratz-Ryan, a research vice president with Gartner Deutschland. "So, in order to justify the LTE deployments, which need more cells than in a 3G network to build out to higher bandwidths, mobile operators need to build out services with greater customer experiences. Those can be defined for instance in terms of quality, flexibility, and blended with social media for consumers or with unified communications for business users."

Windows Phone: risen from the dead?

On the device side, the biggest story out of MWC was Microsoft's radically redesigned Windows Phone 7 mobile operating system. The demonstration was only that – a showing of the new user interface – but it deftly blends typography and minimalist icon design in an easily navigable arrangement, with applications, content, and information clustered in "hubs" that share common organization and navigation themes.

Microsoft deflected all questions about the kernel, new developer tools, Silverlight support, what kind of browser it has, or anything else deemed to be part of the "platform."

Nonetheless, the user interface impressed observers. "[T]his was the radical change for which consumers have been waiting in order to reengage with Microsoft," writes Avi Greengart, an analyst with Current Analysis, a technology advisory firm, assessing the news in an online post. "Windows Phone 7 series is competitive across the board – for entertainment, enterprise use, and personal productivity."

Microsoft didn't directly address changes or improvements aimed at enterprise users. But the demonstration showed the "Office Hub" for the Microsoft Office Suite, including the OneNote note-writing application, and important access to SharePoint, Microsoft's enterprise collaboration, workflow, and document management system. 


But there's still skepticism about whether the OS can catch up to Apple, RIM, and, increasingly, Android. Veteran Microsoft watcher Joe Wilcox wrote in a post that Windows Phone 7 was "dead on arrival." Microsoft has lost too much market share and mindshare, and faces too much successful competition from Apple and Google Android to resuscitate its mobile offering, he argues.

Both ZTE Corporation and High Tech Computer (HTC) unveiled lower-cost smartphones, making use of Qualcomm's silicon and its recently introduced Brew Mobile Platform (Brew MP) operating system, which incorporates its Brew application framework. Among other things, it supports Adobe Flash. Both companies continue to roll out Windows Mobile and Android phones.

The ZTE Bingo will connect via HSDPA at download speeds of up to 7.2Mbps, and has a variety of built-in popular Web services and applications, a 3-megapixel camera, a 3.2-inch touch screen, and A-GPS. The HTC Smart will use HTC's Sense user interface and has a 300MHz processor, the same camera resolution, and a slightly smaller screen. No prices were announced, but Spanish mobile carrier Telefonica said it will offer the HTC Smart "at less than half the cost of smartphones today," according to a company executive.

Carriers coping with change

The device innovation highlights the struggles mobile carriers continue to have in a rapidly changing industry. Google CEO Eric Schmidt told his MWC audience that Google software engineers now focus development first on mobile platforms, and only secondarily on the desktop. The reason for that shift points to the tectonic changes taking place in the mobile industry.

With its aggressive expansion of mobile search and applications, and by linking these with location services and mobile advertising, Google has the potential to recreate the mobile industry in its own image, one where services are free to the end user and paid for by advertising, according to Jagdish Rebello, director and principal analyst at iSuppli.

"Like the rest of the mobile value chain, Google is actively seeking to uncover new user behavior patterns and to drive social networking services through the promotion of cloud storage and computing, mobile advertising, and a variety of location-based services," Rebello writes. "All of the free Google offerings are driving toward this goal."

In Barcelona, mobile carriers revealed their latest experiments in coping with change. Twenty-four carriers banded together to launch a single, unified platform for mobile application development to compete with Apple's App Store and Google's Android Market. The Wholesale Applications Community (WAC) intends to give mobile software developers a cross-platform open programming standard, making it simpler and faster for them to bring applications to market.

But multiple operating systems and application stores will still have to be supported. Telstra, Australia's biggest mobile carrier, plans to create an online "shopping center" where subscribers browse through storefronts to select applications specific to their handsets. "We will build the shopping center environment, which means we won't be bypassed in the value chain," said Telstra CTO Hugh Bradlow at MWC. Telstra isn't currently part of the WAC.

The LTE advance continues

The movement toward a new mobile infrastructure based on the Long Term Evolution (LTE) standard continued:

- There was agreement on a voice-over-LTE specification, part of a push to standardize LTE services and ensure LTE devices can run on different networks.

- Hong Kong mobile carrier CSL concluded the first phase of a commercial LTE trial in which prototype USB modems reached download speeds of up to 100Mbps.

- Japan's NTT DoCoMo showed a prototype notebook computer with an LTE modem.

- AT&T recently announced its selection of two base station vendors for its LTE build-out.

But these capital investments are not the endgame, analysts say. The key is services, often in partnership with Internet companies. Executives from Facebook gave an MWC presentation on their joint experiment with British mobile carrier Vodafone UK, noted Thomas Wehmeier, principal analyst with British market researcher Informa Telecoms & Media, in a blog post. The two companies experimented with offering free access to Facebook for Vodafone subscribers not currently using data services. "They saw an overwhelming success, with 20% of those testing out the trimmed down Facebook service adding data 'bolt-ons' to their monthly [service] plans," Wehmeier writes.

The Skype hype

Verizon announced at MWC that it will bring Skype's VoIP application to Verizon smartphones in March 2010, including a range of BlackBerries, Motorola Droid and Devour, and HTC Droid Eris. Users will need a Verizon data plan (and in all likelihood a Verizon voice plan), and will be able to make and take unlimited, free calls with other Skype users, and use Skype Out to make international calls to any phone at Skype's standard rates. But Verizon has released no other details.

"Those operators wanting to be serious players in the mobile Internet need to embrace openness and they need to allow Internet services on their devices, [and] this includes VoIP," says Dario Talmesio, senior analyst at Informa Telecoms & Media. "Probably the majority of users don't know VoIP, but those who use Skype are attached to it. It's about segmentation. Verizon can both reduce churn and subscriber acquisition costs by targeting customers that use Skype. And they can also increase the uptake of data plans by bundling Skype with specific tariff plans."

But beneath the flash of new user interfaces and consumer VoIP lies one nearly invisible vein of gold: machine-to-machine (M2M) communications, a mobile network of things. The market research firm IDATE estimated that the global cellular M2M market would grow from $15 billion in 2009 to $19.3 billion this year, and then nearly double, to $37.3 billion, by 2013.
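Taken at face value, the market researcher's figures imply steep growth rates. A quick sketch (my arithmetic from the numbers quoted above, not the firm's own projections):

```python
# Global cellular M2M market figures quoted above, in billions of dollars
m2m_2009, m2m_2010, m2m_2013 = 15.0, 19.3, 37.3

# Year-over-year growth from 2009 to 2010
yoy = m2m_2010 / m2m_2009 - 1

# Compound annual growth rate implied for 2010 -> 2013 (three years)
cagr = (m2m_2013 / m2m_2010) ** (1 / 3) - 1

print(f"2009->2010 growth: {yoy:.0%}")   # roughly 29%
print(f"2010->2013 CAGR:  {cagr:.0%}")   # roughly 25% per year
```

"Nearly double by 2013" works out to a 1.93x increase over 2010, or about 25% compound annual growth.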

At MWC, Sierra Wireless unveiled a new modem for this GSM/GPRS-served market, the AirLink GL6100, with an embedded SIM. France's Bouygues Telecom is already adopting it as part of the "first pan-European pre-paid M2M airtime offer," aimed at industrial, sales, payment, and security applications.

And Deutsche Telekom launched a new "competence center" devoted to machine-to-machine wireless solutions, with consulting and deployment services for nine markets, including transport and logistics, vehicle telematics, smart meters, industrial automation control, and healthcare.

John Cox covers wireless networking and mobile computing for Network World. Twitter: http://twitter.com/johnwcoxnww Blog RSS feed: http://www.networkworld.com/community/blog/2989/feed

source : itnews.com


Comcast offers online backup, sharing through Mozy

Subscribers automatically get 2GB of capacity and can buy as much as 200GB

Comcast will bundle online storage capacity from EMC's Mozy division with its broadband service and let residential subscribers share the content they store with friends -- or with the entire Internet. 

With the new Secure Backup & Share feature, users of Comcast high-speed Internet automatically get 2GB of space in Mozy's data centers for backing up any data from their home computers. They can also buy 50GB for US$4.99 per month or $49.99 per year, or 200GB for $9.99 per month or $99.99 per year, the companies announced Thursday. The feature is available now.
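As a back-of-the-envelope check on the pricing above (my arithmetic, not the companies'), both paid tiers work out to roughly a 17% discount for prepaying a full year rather than paying month to month:

```python
# Secure Backup & Share paid tiers, as announced
tiers = {
    "50GB":  {"monthly": 4.99, "yearly": 49.99},
    "200GB": {"monthly": 9.99, "yearly": 99.99},
}

for name, t in tiers.items():
    twelve_months = t["monthly"] * 12        # cost of paying monthly for a year
    savings = twelve_months - t["yearly"]    # dollars saved by prepaying
    discount = savings / twelve_months       # effective annual discount
    print(f"{name}: ${twelve_months:.2f} month-to-month vs "
          f"${t['yearly']:.2f} prepaid, a {discount:.0%} discount")
```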

Mozy has been offering cloud-based backup capacity to consumers and businesses for several years. Consumers can buy unlimited storage for $4.95 per month. But the new offering with Comcast marks the first time Mozy has let users share the data they have backed up, according to Vance Checketts, Mozy vice president of operations at EMC. The company hopes to extend this capability to its own service as well as to the Mozy-based backup services already offered by Vodafone and China Telecom. 

For many consumers, a service such as Secure Backup & Share could be an obvious way to start protecting their data from disasters and hardware failures for the first time, said Charles King, an analyst at Pund-IT. 

"For most people, backup is something that they plan to get around to, like losing weight or eating right," King said. However, it's more critical than those users may think, he added. "Tens of thousands of hard drives crash on a weekly basis," he said. Ideally, consumers should also back up their data to a hard drive they own, both to have another copy for safety and because downloading 200GB of remotely backed-up content over a home broadband connection would take a long time, he added.

With the sharing feature, users could make any or all of the files they back up available to family, friends or the public. Theoretically, a user could copy 200GB of files onto Mozy and make them all searchable and available on the Internet, Mozy's Checketts said. In reality, users may designate some content as public but share the rest through invitations to specific authorized users, he said. Sharing is a natural extension to Mozy's backup service, and the business issues of adding it would be more significant than the technical ones, Checketts said. 

However, it could be a big undertaking just for Mozy to support the Comcast service. If all of Comcast's roughly 15.9 million high-speed Internet subscribers simply took advantage of the 2GB of free storage, Mozy would need 30 petabytes of storage to accommodate them. Mozy already has more than 25 petabytes of data under management. The company beefed up its data centers in advance of the Comcast offering and can rapidly expand its capacity if needed through contracts with suppliers of bandwidth, storage space and other components, Checketts said.
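The 30-petabyte figure is easy to sanity-check from the numbers in the paragraph above (decimal units; the article rounds down slightly):

```python
subscribers = 15.9e6   # approximate Comcast high-speed Internet subscribers
free_gb = 2            # free Secure Backup & Share quota per subscriber, in GB

total_gb = subscribers * free_gb
total_pb = total_gb / 1e6   # 1 PB = 1,000,000 GB in decimal units

print(f"{total_pb:.1f} PB if every subscriber used the full free quota")
```

That comes to 31.8 PB, consistent with the roughly 30 petabytes cited.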

source : itnews.com


Google gets US approval to buy and sell energy

The company says it wants easier access to renewable energy to power its operations

Google has received federal approval to buy and sell energy on the open market, giving it more options for the way it powers its data centers and opening the door to a potential move into the energy-trading business.

Google applied for the authorization last December through a wholly owned subsidiary called Google Energy. The U.S. Federal Energy Regulatory Commission (FERC) approved its application Thursday, granting Google "market-based rate authorization," or the authority to buy and sell energy on a wholesale basis.

"We made this filing so we can have more flexibility in procuring power for Google's own operations, including our data centers," Google spokeswoman Niki Fenwick said via e-mail. 

Data centers are big consumers of energy and Google operates several large facilities around the world -- it hasn't disclosed exactly how many. That makes ensuring a steady supply of affordable energy critical to running its business.

Google has also said it is committed to being "carbon neutral," in part by using as much renewable energy, such as solar and wind power, as possible. "FERC authority will improve our ability to hedge our purchases of energy and incorporate renewables into our energy portfolio," Fenwick said.

She declined to elaborate, but the company told the Wall Street Journal last month that FERC approval would allow it to approach producers of renewable energy directly to buy power for its operations.

The authorization also raises the prospect that Google may start to buy and sell energy as a business. Its application asked that Google Energy be able to "act as a power marketer, purchasing electricity and reselling it to wholesale customers."

Google didn't respond to a question about that Thursday. Fenwick told the Journal last month that the company has "no plans" to become an energy trader or to sell energy services, but she also acknowledged that the company was "not sure" how it planned to proceed.

FERC's notice shows that the California Public Utilities Commission filed a motion to intervene in the application. It was unclear what concerns the Commission had, if any, and a Commission spokesman could not immediately comment.

The approval is effective Feb. 23, as Google requested. The company told FERC it does not own or control any wholesale electrical generation or transmission facilities, so the agency determined that Google does not have "market power" or the ability to create barriers to entry.

source : itnews.com


Google fights for orphaned books

Fending off criticism from multiple parties, Google once again made the case for digitizing millions of orphaned books before the U.S. District Court for the Southern District of New York, in a fairness hearing held Thursday.

A total of 27 different parties requested to speak before the court. Five were in favor, including Sony, the National Federation of the Blind and the Center for Democracy and Technology. The rest -- 22 in total -- opposed the settlement, including Amazon, Microsoft, the Open Book Alliance, and the Electronic Privacy Information Center.

Those in favor praised the idea of rendering hard-to-find books in electronic form, because they could be accessible to a much larger group of readers, and not be lost to the ages. 

The objectors, however, voiced strong concerns that the settlement preempts U.S. copyright law altogether. Others raised privacy and antitrust concerns.

The court will decide whether or not to approve a proposed settlement of a class action suit brought by a number of author groups against Google over its scanning of out-of-print books.

U.S. District Judge Denny Chin, presiding over the proceedings, said that he would not reach a decision at the end of that day's fairness hearing, given the amount of feedback the court received.

The settlement, reached in October 2008, came out of a 2005 lawsuit brought about by the Authors Guild, the Association of American Publishers and other groups of concerned writers and content producers. 

The groups expressed outrage that Google was scanning millions of books, an act they felt exceeded U.S. fair use provisions. The company was planning to offer snippets of the books as part of its search results.

The resulting settlement allows Google to scan books that are still in copyright yet are out of print, provided that it sets up a registry of authors and book titles, and makes an effort to notify authors of these books that their works are being reused. 

In exchange, the company can then offer snippets, or even fully downloadable versions of the books for a fee, from which it would pay the authors a percentage of the profits. Authors would be free to opt out of the program.

Reacting to U.S. Department of Justice antitrust concerns, the parties revised the settlement and resubmitted it to the court in November, narrowing the scope of the agreement to U.S. books.

Despite the revisions, the Justice Department voiced concerns. Deputy Assistant Attorney General William Cavanaugh argued that the settlement's opt-out approach preempts copyright law insofar as copyright law grants copyright holders full control over how their works can be published.

The proposed settlement "eviscerates the right to prior approval," Cavanaugh said.

This view was echoed by others. The settlement, in effect, allows Google to continue to infringe copyright law in the future by not obtaining prior approval, said David Nimmer, a representative for Amazon.

Google attorney Daralyn Durie argued that an opt-in approach would not work for the company, making the opt-out provision a non-negotiable part of the settlement.


Her reasoning was that Google cannot tell which out-of-print books will prove popular once made digitally available again, so finding each author and persuading him or her to opt in would be prohibitively expensive.

She added that Microsoft tried this opt-in approach and has since given up on its efforts.

Durie estimated that there are about five million books in U.S. libraries that are out of print but still under copyright. In many cases the authors cannot be located, making them orphaned books. 

A variety of other criticisms were also raised. Thomas Rubin, a Microsoft intellectual property strategy attorney, noted that the settlement gives Google an unfair advantage in the search industry, as it hands the company the rights to digitize up to 147 million out-of-print books for its own search results, while other search companies would still need to procure reproduction rights on a case-by-case basis.

Representatives from the Electronic Frontier Foundation (EFF), the Center for Democracy and Technology (CDT) and the Electronic Privacy Information Center (EPIC) all voiced concerns about how Google could track what books people read, right down to the particular page numbers.

U.S. libraries have been fierce protectors of people's privacy in terms of not divulging what books their patrons check out, said Marc Rotenberg, executive director of the Electronic Privacy Information Center. Because Google is a commercial enterprise that makes money by profiling users for advertising, readers could not expect the same level of anonymity, he said.

While the CDT and EFF offered a number of suggestions for how Google could put privacy controls in place to alleviate these concerns, such as limiting how long Google would hold onto the tracking data, Rotenberg maintained that the conflict of interest would simply be too great to mitigate.

Not all companies were opposed to the settlement. Janet Cullum, representing Sony, said that the proposed registry would open a wider array of material to the electronic book market, since the registry would allow companies other than Google to track down authors and make their own arrangements.

Marc Maurer, president of the National Federation of the Blind, argued that the settlement would be "good news for the blind," insofar as it could make a vast number of previously unavailable texts accessible through the use of assistive technologies. This is a market only partially served by commercial publishers today, he said.

Paul Courant, a librarian for the University of Michigan, noted that the digitization process could preserve countless academic and historic texts that are in fragile states and only available in a few libraries.

Cavanaugh and others agreed that digitizing such books would be a good thing, though ultimately it would be up to Congress to amend copyright law to make provision for projects such as Google's, rather than having the matter handled in a class action lawsuit settlement.

"You cannot use procedural rules to modify rights," he said.

source : itnews.com


Kneber just another botnet?

The Kneber botnet, so christened by security firm NetWitness in describing it to the press, is nothing new and there are many other botnets like it out there, according to a number of other security firms.

The Kneber botnet revealed

Kneber is described as a botnet command-and-control system based on the ZeuS Trojan, a well-known type of malware capable of stealing financial data and login credentials. NetWitness says it discovered Kneber in January while deploying its network security equipment for a customer, and estimates the botnet has infiltrated "75,000 systems in 2,500 organizations around the world." Other security vendors say you could expect to find another 100 or more ZeuS-based botnets just like it if you went looking.

Many commend NetWitness for uncovering the server cache of stolen password, login, and Web browser information related to the Kneber botnet. But there are probably many more Kneber-like botnets out there today, some say, and because Kneber uses an older version of ZeuS, it doesn't even represent the worst the malware can do.

What NetWitness uncovered in Kneber, 75GB of information from 75,000 compromised machines over 90 days, "is above the median size of a data cache," says Don Jackson, security researcher at SecureWorks, who notes most botnet caches his firm has uncovered tend to run 10GB of data for about 23,000 compromised user computers.
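For scale, those figures work out to roughly a megabyte of stolen data per compromised machine for Kneber, versus under half a megabyte per machine in a more typical cache (my arithmetic from the numbers quoted, using decimal units):

```python
# Cache sizes and machine counts reported above
kneber_gb, kneber_hosts = 75, 75_000   # NetWitness's Kneber find, over 90 days
median_gb, median_hosts = 10, 23_000   # typical cache, per SecureWorks

kneber_per_host = kneber_gb * 1e3 / kneber_hosts   # MB per compromised machine
median_per_host = median_gb * 1e3 / median_hosts

print(f"Kneber: ~{kneber_per_host:.1f} MB/host; "
      f"typical: ~{median_per_host:.2f} MB/host")
```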

But Kneber, he notes, is based on the older 1.2 version of ZeuS, now given away for free, and is not what would typically be used by a "professional high-dollar operator," who would make a lot of effort to hide behind proxies. "If you wanted to go hunting for these things, you could find them every month," he noted.

The most recent version of ZeuS, version 1.3, first seen in November of last year, costs thousands of dollars, with even a single module selling for $10,000 in criminal circles, according to SecureWorks, which is expected to issue an in-depth report on ZeuS 1.3 next Monday. The new version is so dangerous that once it takes hold on an infected machine, it can rip through unauthorized online wire transfers, and more.

Anyone investing in ZeuS 1.3 is likely to take a lot of trouble to successfully hide the botnet, Jackson notes.

The problem is that Kneber-like botnets are a dime a dozen and certainly nothing new, according to other security firms.

"We're tracking, at any time, about 100 unique ZeuS botnets," says Marc Maiffret, chief security architect at FireEye. "There are constantly changing variants of it." This is one reason the malware has a chance to evade signature-based defenses. But Maiffret also says he'd characterize a ZeuS botnet controlling 75,000 systems as midsize to large.


The ZeuS Trojan "is not a new threat. It's a threat that's been around for a few years," says Elias Levy, senior technical director of Symantec's security-response group, who characterized the Kneber botnet as fairly "normal" in size. He says these types of botnet infections typically reach into the tens of thousands of machines, and it's not surprising to see hundreds of thousands of botnet-controlled machines. He commends NetWitness for gathering "pretty good intelligence; they got a glimpse of how it worked," but adds, "that one botnet, it's not that much different than the many others out there."

McAfee also piped in, issuing a statement saying, "In the world of cybersecurity, the 'Kneber' botnet is, unfortunately, just another botnet. With 75,000 infected machines, Kneber is not even that big, there are much bigger botnets."

In describing Kneber, NetWitness noted that several of the infected machines it tracked appeared to also be infected with Waledec, another form of malware-based botnet, which it surmised may be there for purposes of "resilience and survivability" by the criminal attack group. But SecureWorks researcher Don Jackson said he wouldn't be inclined to characterize seeing ZeuS and Waledec on the same machine as necessarily representing something unusual since Waledec is often used as a downloader for ZeuS.

source : itnews.com


Wednesday, February 10, 2010

HBGary releases Aurora detection tool

Aurora Inoculation Shot will remove malware that steals corporate data

Security vendor HBGary has released a free software tool that can remove "Aurora" malware, linked to corporate espionage at more than 30 companies.

Called the Aurora Inoculation Shot, this utility will remotely scan Windows machines over the network for signs of Aurora and can remove the malicious software as well. It uses the Windows Management Instrumentation services to carry out the inoculation.

Although Aurora has been linked to attacks on just 34 companies, the software has captured the attention of corporate executives because some believe it is connected to a widespread industrial espionage campaign originating in China.

Last month, Google admitted that it had been hacked using the Aurora software, and the company's security team gained access to a command-and-control server holding data that linked the attack to other major companies such as Adobe Systems and, according to reports, Symantec, Juniper Networks, Northrop Grumman and Dow Chemical.

Security experts have now identified a dozen other Aurora command-and-control servers that may be collecting data on other companies, but many of those servers are hosted by ISPs that have not cooperated with investigations.

At this point, experts are divided on whether Aurora is important because it represents a widespread campaign, possibly condoned or even sponsored by the Chinese government, or because Google took the unusual step of admitting that it had been hacked.

According to HBGary CEO Greg Hoglund, the Aurora malware is similar to many other programs that criminals have used for years. "The Aurora stuff isn't that complicated," he said. "It smells like any other criminal malware that's out there."

Although Google made the Aurora hack a point of negotiation with the People's Republic of China, "there's no hard evidence anywhere that shows that China's government has anything to do with it," Hoglund said.

Despite all the attention Aurora has received, the problem "hasn't gone away," Hoglund added. "It's still out there and operating."

That's why HBGary has made the inoculation software available. The company has also released a report outlining what is publicly known about the malware. "We're the first ones to release a concise report that brings all the data to one spot," he said.

source : itnews.com


Google Buzz criticized for disclosing Gmail contacts

Some default settings mean third parties can see who Buzz users have been e-mailing

One day after its launch, privacy concerns have been raised about Google's new Gmail-based social-networking tool, Buzz.

At issue is a feature that compiles a list of the Gmail contacts who users most frequently e-mail or chat with. Buzz automatically starts following these people and makes the list public, meaning strangers can see who Buzz users have been in contact with.

The issue was noted by the Silicon Alley Insider on Wednesday. "Imagine ... a wife discovering that her husband emails and chats with an old girlfriend," the Web site said. "Imagine a boss discovers a subordinate emails with executives at a competitor."

There are some mitigating factors, however. Buzz only shares information about other people who are using Buzz and have set up public profiles in Google. So currently, most Gmail users are not publicly listed by the service. Users can also "unfollow" people who they don't want to be linked to. 

And while Buzz requires users to set up a public profile before they can post messages, it does give them an option to hide who they are following and who is following them. 

However, the default setting is to make the information public, and only users who click on an "edit" tab can see the choice to opt out. That means many people who start using Buzz may be publicly linked to other users without realizing it.

Reached Wednesday afternoon, a Google spokesman had no immediate comment.

Google introduced Buzz as an alternative to popular sites such as Facebook and Twitter, which are increasingly being used to navigate the Web.

source : itnews.com


Simulated cyber-attack to test government response

Cyber ShockWave test involves former administration staff, national security officials

Security industry analysts and lawmakers will get an unprecedented chance next week to evaluate how the government might respond to a cyber-attack on critical infrastructure targets.

The Bipartisan Policy Center (BPC), a Washington-based non-profit established in 2007 by several lawmakers, will host a simulated nation-wide cyber-attack next Tuesday for a group of former administration and national security officials, who will be playing the roles of Cabinet members.

The goal of the simulation, called Cyber ShockWave, is to see how officials in key government positions would react to a real-time cyber-attack, and to evaluate the split-second decisions they may be required to make to deal with it, a BPC alert noted.

Those playing the roles of various cabinet members include former DHS secretary Michael Chertoff, the former Director of National Intelligence John Negroponte, former White House Homeland Security Advisor Fran Townsend and former White House press secretary Joe Lockhart.

The participants, none of whom will have any advance information on the simulated attacks, will be expected to advise the President on the unfolding attacks and craft a response to them. The event is scheduled to take place at Washington's Mandarin Oriental Hotel.

A report in The Atlantic said that a considerable effort is being put into making the exercise as realistic as possible.

A production company has been hired to recreate a White House situation room in the Mandarin hotel, and professional scriptwriters will aid security experts in creating the simulated attack.

The exercise itself was developed by retired general and former CIA director Michael Hayden and several others, including former New Jersey governor Thomas Kean and Congressman Lee Hamilton, both of whom were co-chairs of the 9/11 Commission. Companies and organizations participating in the effort include General Dynamics, Georgetown University and PayPal.

This is not the first time that BPC has organized a similar exercise. In 2007, it hosted Oil Shockwave, an oil crisis simulation, in which nine former cabinet and national security advisors participated. The purpose of that exercise was to explore the economic and national security implications of a prolonged crisis in the oil industry.

This month's planned cyber-security simulation comes amid growing concerns over state-sponsored attacks against critical IT assets. The recent cyber-attacks against Google and more than 30 technology companies allegedly by operatives based out of China have highlighted what many say is the need for a formal U.S. policy for deterring and responding to such attacks.

Jaikumar Vijayan covers data security and privacy issues, financial services security and e-voting for Computerworld. Follow Jaikumar on Twitter at @jaivijayan or subscribe to Jaikumar's RSS feed. His e-mail address is jvijayan@computerworld.com.

Read more about cybercrime and hacking in Computerworld's Cybercrime and Hacking Knowledge Center.

source : itnews.com


Macworld 2010 refocuses for new era, without Apple

Attending a Macworld Expo without Apple may seem like going to a rock concert to see the opening acts, but despite lacking the rock star presence, the show will go on nonetheless.

It was just prior to last year's conference that Apple announced 2009 would be the last year it participated in the annual gathering, which has been held in San Francisco since 1985. This year's Expo, which has been rebranded Macworld 2010, will feature no keynote from Apple executives and no Apple booth on the show floor in Moscone Center's North Hall.

But while Apple's absence will certainly change the dynamic of the event, it hardly means that the show won't go on. There's plenty to do throughout the five-day event, which runs from Tuesday February 9 through Saturday February 13, even without Apple around.

Walking the floor

"It's going to be a smaller show this year," acknowledged Paul Kent, vice president and general manager for Macworld 2010. Last year's exhibition spanned both Moscone's North and South halls. "Many vendors decided to sit on the sideline and see what Macworld without Apple would look like. It's really up to those vendors, going forward."

The exhibition hall, open from February 11 to February 13, remains one of Macworld 2010's strongest draws, with more than 250 vendors. Highlights include the Mobile Application Showcase, the largest collection of iPhone developers ever assembled, and the Indie Developer Pavilion, a special area for independent Mac developers of all sizes. In addition, more than 60 vendors are introducing new products at the show. There's no question, said Kent, that the product experience remains a central part of the event: "The three pillars of Macworld are product discovery, conference education, and the social experience."

Kent freely admits that rebounding from Apple's departure may be a two-year process, but he remains optimistic about the future. "I anticipate the vendors are going to come back in droves."

Naturally, it will help if the conference attendance is high. While the exact figures won't be known until the dust clears, there are more than 30,000 pre-registrations for the event. By comparison, Macworld Expo saw attendance of 45,572 in 2007, and a 10 percent increase over that in 2008. Attendance figures for last year's event, however, were not disclosed. Kent noted that he expects a packed house this year, including the more than 700 members of the media that have registered.

Headline acts

In addition to the exhibition hall, this year's show will have a number of feature presentations that Kent hopes will educate, inform, and entertain attendees. "You don't replace a Steve Jobs keynote," said Kent. "But we do know how to build very content-rich events here."

On Thursday, New York Times tech columnist David Pogue will host a session called Late Night with David Pogue, featuring surprise guests and musical performances, and noted writer and director Kevin Smith will hold a Q&A session on storytelling, technology, and filmmaking.


On Friday, entrepreneur and former Apple Evangelist Guy Kawasaki will talk to developers about the state of innovation in the tech market; podcaster and pundit Leo Laporte will broadcast live with guests like MythBusters' Adam Savage and The Byrds' Roger McGuinn; and Daring Fireball author John Gruber will discuss the top issues shaping our world.

This year, for the first time ever, the show also extends to the weekend, with a musical performance by artist BT and an event discussing Apple's iPad, led by Macworld Editorial Director Jason Snell.

Besides the feature presentations and exhibition floor, Macworld 2010 features six conferences focused on particular industry trends or classes of users, including a track for IT professionals, day-long symposiums for those interested in business topics like marketing iPhone apps, and hands-on instruction sessions on specific pieces of software.

2010 and beyond

Even with all of that going on, it is hard to stave off the doom and gloom surrounding this year's event, which stems from a sense of history repeating itself. When Apple announced it wouldn't be returning to Macworld Expo in 2010, it said that trade shows had become a less important part of how it reached its customers, citing its more than 200 retail stores and its popular website.

Apple also pointed to the fact that it has gradually scaled back its participation in most trade shows. In 2002, IDG World Expo announced that it would move its east coast show from New York back to Boston, which it had left in 1998. Apple's response was almost immediate: it would not participate in a 2004 Boston show. In 2005, IDG World Expo shuttered the east coast show to concentrate on the San Francisco Expo.

Kent said there are differences between the way the east coast show was handled after Apple's departure and the way IDG is handling the San Francisco show. "There's a tacit acknowledgement by us that Apple is not here and that the show needs to be different. We're a little more in touch with our marketplace and our community. We deal with our market in a much more intimate way now."

It's been a hard time for trade shows all around, especially where the Cupertino-based company is concerned. In recent years, Apple has also forgone the National Association of Broadcasters conference, Apple Expo in Paris, and the annual NAMM music show. "Trade shows need to change to provide value," said Kent.

Ultimately, the future of the event will rest upon whether vendors and attendees feel that they've gotten their money's worth. But the value of Macworld Expo exceeds merely what is on offer on the show floor. The opportunities for networking, meeting fellow Mac users, and interacting with developers and experts are unmatched.

Next year's show has already been scheduled in San Francisco, running from January 25 to 29. "We're committed to moving forward," said Kent. "We've taken a lot of bullets this year, but we've known our path." For Kent, it's not about Apple, it's about the community. "Those who come," he pointed out, "really like the show."

source : itnews.com


FontAgent Pro Server 4 released

Insider Software has announced FontAgent Pro Server 4, an upgrade to its server software targeted to the enterprise sector of the font management market. Available now, it features font-usage tracking, live backup, automatic failover server access, Kerberos single sign-on support, and directory services enhancements.

FontAgent Pro Server 4 features a real-time font-usage manager that reveals who has access to libraries and fonts and when they activate and deactivate them. Administrators can use a scripting language to save this data in various file formats for deeper analysis in Excel, reporting scripts, databases, and asset-management and accounting programs. These statistics help organizations monitor compliance with license agreements, track font usage by person or project, and avoid font-license compliance audits.
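As an illustration of the kind of license-compliance check such exported data enables, the short sketch below counts distinct users per font and flags fonts that exceed their licensed seat counts. The CSV layout, font names, and seat numbers here are hypothetical assumptions for the example, not FontAgent Pro's actual export format.

```python
import csv
import io

# Hypothetical export: one row per activation event (user, font).
# FontAgent Pro's real export fields may differ.
SAMPLE_EXPORT = """user,font
alice,Helvetica Neue
bob,Helvetica Neue
carol,Helvetica Neue
alice,Garamond
"""

# Assumed license terms: licensed seats per font (illustrative only).
LICENSED_SEATS = {"Helvetica Neue": 2, "Garamond": 5}

def over_licensed(export_csv, seats):
    """Return fonts whose distinct active users exceed their licensed seats."""
    users_per_font = {}
    for row in csv.DictReader(io.StringIO(export_csv)):
        users_per_font.setdefault(row["font"], set()).add(row["user"])
    return {font: len(users)
            for font, users in users_per_font.items()
            if len(users) > seats.get(font, 0)}

print(over_licensed(SAMPLE_EXPORT, LICENSED_SEATS))
# Helvetica Neue has 3 distinct users but only 2 licensed seats
```

The same tally could be fed into a spreadsheet or asset-management system, as the product's export options suggest.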

This new release includes live backup services for a customer's entire database of fonts, users, and groups to ensure that fonts are continuously available. Administrators can set the frequency and time of backups and can add font-server archiving into current backup scripts and processes. The backup manager archives files to any networked or local drive and provides the tools to easily restore a server to a previous state.

Failover services allow FontAgent Pro clients to automatically connect to alternative servers when their primary font server is unavailable. That feature, combined with the program's server replication function, makes it a good choice for multi-server environments, the company says.
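The general failover pattern described above, where clients fall back to an alternate server when the primary is unreachable, can be sketched generically. The server names and the `connect` callback below are illustrative assumptions, not FontAgent Pro's API.

```python
def connect_with_failover(servers, connect):
    """Try each server in priority order; return the first live connection.

    `connect` is any callable that raises ConnectionError on failure --
    in a real client it would open the font server's network protocol.
    """
    errors = {}
    for server in servers:
        try:
            return server, connect(server)
        except ConnectionError as exc:
            errors[server] = exc  # remember why this server failed
    raise ConnectionError(f"all font servers unreachable: {errors}")

# Illustrative use: the primary is down, so the client silently falls back.
def fake_connect(host):
    if host == "fonts-primary.example.com":
        raise ConnectionError("timeout")
    return f"session-to-{host}"

server, session = connect_with_failover(
    ["fonts-primary.example.com", "fonts-backup.example.com"], fake_connect)
print(server)  # fonts-backup.example.com
```

Combined with server replication, this kind of client-side fallback is what keeps fonts available in the multi-server environments the company describes.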

FontAgent Pro Server 4 delivers faster Active Directory, Open Directory and LDAP synchronization with improved support for nested groups, standalone users, keychain-based password protection, and non-standard directory setups. It also includes more flexible user permissions that include group administrators and multi-level roles that can control which users can upload fonts, create users and groups, edit licenses, upload and edit sets, upload and edit font libraries, assign fonts and users to groups, and view font usage information.

When users connect to FontAgent Pro Server 4, it checks whether they have previously validated their login via the Kerberos network authentication protocol. If not, FontAgent Pro asks them to enter their user name and password, and sign-on occurs automatically with no additional configuration.

FontAgent Pro Server 4 works with Mac OS X 10.4 or later and requires 30MB of disk space and 256MB of memory. The program is compatible with a range of Macintosh hardware, from dual G4 Macintosh computers up to Apple's Mac Pro and Xserve multi-processor systems. It does not require dedicated server hardware and lets administrators fine-tune performance to compensate for varying speeds on enterprise networks.

FontAgent Pro Server is $1,695; the upgrade price is $850 for users of Version 2 or later. Each user connecting to the server requires a licensed version of the FontAgent Pro connected client, whose suggested price is $130 per seat. Volume license pricing is available on request. For more information, visit Insider Software online.
source : itnews.com
 

Free Online IT Information Copyright © 2009