2017-09/1579/en_head.json.gz/26099 | Schafer: What Target needs most is ...
Article by: LEE SCHAFER
[Photo captions: The Target store near Target headquarters on Nicollet Mall in Minneapolis. | Lou Gerstner wasn't an IBM lifer when he became CEO; he wasn't even a technology executive. (Stuart Ramson • AP file) | Hubert Joly, who is leading Best Buy's resurgence, was labeled "unimpressive" by one stock analyst. (GLEN STUBBE • Star Tribune file)]
A list of likely candidates seems to appear within hours of a big CEO job opening, always produced with all the care of Capt. Louis Renault’s list of usual suspects in Casablanca.
When Target’s CEO position opened last week with the resignation of Gregg Steinhafel, the first flash of potential replacements included a former vice chairman of Target, the CEO of the Gap and the CEO of an enterprise called Tractor Supply Co.
A couple were retailing executives who made the list just by having grown up in Canada. Perhaps some analysts thought they would know how to get other Canadians to flock to Target’s new north-of-the-border stores.
Target’s board should ignore this list. If the search goes well, the next CEO of the Minneapolis-based company likely won’t be anyone on it.
That’s because the list is dominated by accomplished merchants, and Target doesn’t need another one of those. It needs a gifted leader.
Retail experience isn’t disqualifying, certainly, so long as the qualities of leadership are there.
As for those qualities of leadership, it isn’t a short list. What’s important to remember is that the very best leaders have many of them, none overshadowing the others. That’s the framework of leadership that came from the 19th century Prussian general Carl von Clausewitz.
It may seem odd to consult the nearly 200-year-old thoughts of a German soldier on CEO selection. One reason Clausewitz’s classic “On War” remains well-read today is his insistence that to be a top leader, a single outstanding talent or trait isn’t nearly enough. It could even be dangerous. Leadership requires having a full complement of skills and traits, all working together.
Think about what’s needed at Target. The new boss probably needs to have genuine empathy for the staff, who have been through layoffs and the disappointment of recent setbacks like the underwhelming Canadian launch and a major data security breach.
The next CEO at Target should also be an engaging speaker, whether giving a five-minute impromptu talk in the lunch room or addressing a ballroom full of New York fund managers. And the next time there's a data breach or some other crisis, he or she needs to go in front of the TV cameras and be convincing about what's being done to fix it.
Admittedly these are not qualities that the soldier-philosopher Clausewitz listed, but even his insistence on courage should mean something to Target’s board of directors. By courage he didn’t just mean charging a row of cannons, he also meant the quality it takes to accept responsibility for mistakes.
What’s most interesting is the way Clausewitz thought about intelligence, another must-have trait. For him it was the brainpower to quickly grasp the best option and choose a course of action, even when three-fourths of what a commander really needed to know was wrapped in a fog of uncertainty.
And it seems fair to say the fog is particularly thick in retailing right now.
The ability to really understand what Target’s customer wants is a critical CEO skill, said Mary Saxon, the leader of the consumer markets practice for the international executive search firm Heidrick & Struggles, speaking in the spirit of Clausewitz.
If that sounds easy or obvious, she added, it’s not. Not when the customer can now shop on an iPhone at midnight.
“Retailing will always be about having the most compelling product,” she said. “But all the rest of it has changed.”
And in a fast-changing environment, conventional thinking risks selecting an outstanding merchant who starts work with a plan for fixing Target — and then gets proven wrong within a year.
The best leader is going to report for work not sure of the right thing to do but will plan to figure it out in a hurry — when so much that would be helpful to know remains lost in the fog.
One of the best examples of that came at IBM more than 20 years ago, as the once-dominant Big Blue was losing billions of dollars and looking for new leadership.
The board’s pick, Lou Gerstner, didn’t just come from outside IBM but from outside the industry. Three weeks into his job, Gerstner presided over his first strategy planning meeting. As he later told a Harvard Business School class, “after eight hours I didn’t understand a thing.”
But Gerstner’s challenge at IBM wasn’t forming a new strategy for selling computers to businesses, although it took months for him to fully realize that. He learned he had to change IBM’s culture.
He needed to get the competitive, goal-oriented individuals who collectively had once kept IBM at the top of its industry to buy into the idea of being collaborators who would achieve their goals as members of a team.
What’s happening at Best Buy Co. looks a little like the IBM story. It’s certainly true that going outside to Hubert Joly, the CEO of the travel and hospitality company Carlson, got the same sort of baffled reaction in 2012 as Gerstner’s appointment as CEO in 1993.
Best Buy’s stock dropped 10 percent that day. No securities analyst called Joly an inspired choice for the Richfield-based company. One just called him “unimpressive.” None of them knew if he had ever worked in a store.
It’s now clear that Joly’s lack of retailing experience hasn’t hurt a bit. His leadership gifts, for communication and strategy development, have been very much on display.
Joly has said he wasn’t looking for a job when he decided to go to Best Buy. He was attracted to the challenge of turning around a $45 billion company many investors had given up for dead.
That’s another trait the old soldier, Clausewitz, noticed in the best generals. Nothing beats hunger for honor and renown as a motivator.
Clausewitz appeared frustrated that this concept so easily got confused with simple glory-seeking — something not admired in the Prussian officer corps or in a corporate boardroom.
But, he wrote, show me a great leader who did not first aspire to do great things.
The Target board should know that its best candidates will have many positive traits — including the fiery ambition to lead Target back into the ranks of America’s most-admired companies.
lee.schafer@startribune.com • 612-673-4302 | Technology
2017-09/1579/en_head.json.gz/26119 | Are we heading towards a second era of mass TV piracy?
Dave James
Or is the rising cost of watching your favourite TV worth it?
How much would you pay to legitimately watch all the TV shows you're desperate to keep up to date with?

Now that the very best shows are being disseminated between terrestrial, subscription television and over-the-top (OTT) streaming services it's going to cost a fortune if you want to always keep up with the latest top TV.

Either that or we return to a time when you gathered around the water-cooler, office kitchen or bar to talk about the latest events in Game of Thrones, and where there was a complicit understanding that most, if not all, of you had torrented the show from some illegitimate site.

That's my concern with the growing number of different places - whether traditional pay-TV bods or new contract-free OTT folk - offering gated, exclusive access to their own TV shows.

As much as there is evidence that a great number of people have both an OTT as well as a standard TV service - some 57% of UK households and the same in the US - there are going to be far fewer that can sign up to multiple OTT as well as a host of pay-TV subscriptions.

Fractured

Game of Thrones can be exclusively found on Sky Atlantic here in the UK, The Walking Dead shambles around on Fox, Fear the Walking Dead is tied to AMC from BT, every Marvel show, and possibly all the Star Wars ones too, are on Netflix and the Emmy-nominated Transparent is exclusively on Amazon Prime Instant Video.

For all that you'd be looking at paying just over £800 to have the basic Sky package, the cheapest Netflix sub, Amazon Prime and BT TV for a full year.

And no one is going to do all that. Surely.

So something's got to give - and if you're not interested in getting your broadband from BT then it's going to be that which gets the first chop. Likewise if you're not bothered about next day delivery then you're probably going to ditch Amazon Prime and its £79/year price tag.

Those two both demand a full 12 month commitment to their cause - which is conversely what makes Netflix's monthly contract-free service such a good weapon against piracy.

You can jump in and out of a Netflix sub, so when your favourite show gets another series released in full on the streaming service you can pick up another month's subscription, get your fix and drop out when you're done.

Netflix though is obviously working hard at getting enough content signed up to its service so you don't let your subscription slide.

Sky has also been very smart about this with its separate NOW TV packages.

Like with Netflix you can come and go as you please - either drop £15 on a Roku-built NOW TV box and a month's entertainment package or just go online via a browser or mobile app with your month pass.

It's this sort of effort made by the content providers to make it as easy, and convenient, as possible for the consumer to get the content they want which will keep the piracy levels down and ensure the people making the actual shows get paid.

And Netflix especially is specifically targeting piracy.

Back in April Netflix's CFO, David Wells explained that "piracy is a governor in terms of our price in high piracy markets outside the US. We wouldn't want to come out with a high price because there's a lot of piracy, so we have to compete with that."

But with more and more money spent garnering exclusives for different platforms and services, with the express intention of tying the consumer into their own gated-community rather than maintaining lower levels of piracy, the TV market is going to become ever more fractured.

And with that the torrent sites are going to start getting a lot busier again.

It could be worse though, if you wanted to watch all the live Premier League and Champions League football there is that's liable to cost you nearer £900 for the year...
Components Editor: Dave (Twitter) is the components editor for TechRadar and has been professionally testing, tweaking, overclocking and b0rking all kinds of computer-related gubbins since 2006. Dave is also an avid gamer, with a love of Football Manager that borders on the obsessive. Dave is also the deputy editor of TechRadar's older sibling, PC Format. | Technology
2017-09/1579/en_head.json.gz/26130 | Almanac forecast: Wintry weather - and mystery
DAVID SHARP, Associated Press | Published: August 26, 2012 12:16 PM
LEWISTON, Maine (AP) -- The weather world is full of high-profile meteorologists like NBC's Al Roker and the Weather Channel's Jim Cantore. But the guy making the forecasts for the Farmers' Almanac is more like the man behind the curtain.

He's cloaked in mystery.

The publisher of the 196-year-old almanac, which goes on sale this week, takes great pains to protect the identity of its reclusive weather soothsayer, who operates under the pseudonym Caleb Weatherbee. Caleb's real name and hometown are a secret. And so is his age-old formula used for making long-term weather forecasts.

"It's part of the mystique, the almanac, the history," said Editor Peter Geiger of the current prognosticator, the almanac's seventh, who has been underground since starting the job in the 1980s.

Even just to speak to the forecaster, the almanac would agree only to an unrecorded phone call with the man from an undisclosed location.

The weather formula created by almanac founder David Young in 1818 was based on planetary positions, sunspots and lunar cycles. Since then, historical patterns, weather data and a computer have been added to the mix.

The mystery man's forecast for the coming winter suggests that people from the Great Lakes to northern New England should get out their long johns and dust off their snow shovels because it's going to be cold and snowy. It's also supposed to be wet and chilly in the Southeast, and milder for much of the rest of the nation.

In an election season, the almanac dubbed its forecast "a nation divided" because there's a dividing line where winter returns for much of the east, with milder weather west of the Great Lakes.

Scientists generally don't think too much of the almanac's formula.

Ed O'Lenic, operations chief for NOAA's Climate Prediction Center, declined to knock the almanac's methodology but said sun spots and moon phases aren't used by modern-day meteorologists.

"I'm sure these people have good intentions but I would say that the current state of the science is light years beyond what it was 200 years ago," O'Lenic said from Maryland.

In this year's edition, the almanac's editors are contrite about failing to forecast record warmth last winter but they suggested readers should go easy on the publication -- and on Caleb -- because nobody forecast 80-degree weather in March that brought the ski season a rapid end in northern New England.

"Let's face it -- the weather was so wacky last year. It was so bizarre," said Sandi Duncan, managing editor, pointing out that NOAA and Accuweather also missed the mark.

Indeed, NOAA and Accuweather didn't project the extent of the warm winter.

"We missed it too, to put it bluntly," said Tom Kines, a meteorologist at Accuweather in State College, Pa. "It was a weird winter last year."

The Maine-based Farmers' Almanac is not to be confused with the New Hampshire-based Old Farmer's Almanac. Both issue annual forecasts, with the Old Farmer's Almanac scheduled for next month.

Geiger, who keeps a copy of Weatherbee's secret weather formula in a secure location, is quick to point out that there's more to the almanac than just weather forecasts.

Hearkening to its old traditions, the folksy almanac features recipes, gardening tips, jokes, facts and trivia, and a guide to a simpler life.

For example, who knew that you could clean your toilet by pouring in Coca-Cola instead of harsh chemicals, or that putting a spoonful of vinegar in a pet's water dish keeps fleas at bay?

As for the weather, almanac readers say it's all good, clean fun.

"It's a fun publication to get and to read, to watch and see how accurate it is," said Wanda Monthey of Alexandria, Va. "It's a lot like a game."

___

Online: http://www.farmersalmanac.com/

___

Follow David Sharp on Twitter at http://twitter.com/David_Sharp_AP | Technology
2017-09/1579/en_head.json.gz/26192 | Patreon Raises $2.1 Million To Provide Another Fundraising Platform
Subbable, the Vlogbrothers’ new online video fundraising platform, is making headlines thanks to its high-profile founders. However, it is not alone. Patreon, a site launched back in May, also allows YouTube creators to set up regular fundraising drives to fuel their videos. The brainchild of YouTuber Jack Conte, Patreon has big plans of its own; Conte just raised more than $2 million in venture capital, which will help him bring the system of patronage to the forefront of the online video space.
Like Subbable, Patreon operates on a 'pay what you want' system, but it is distinct in terms of the period over which donors contribute to their favorite creators. Instead of a monthly model, Patreon operates on a per-video basis, where 'patrons' donate the same amount for each video. This guarantees creators a fixed budget for each video, and monthly maximums ensure that patrons won't unknowingly donate their whole life savings to a YouTube video.
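As a rough illustration of the per-video model just described (this is our sketch, not Patreon's actual billing code; the function name, amounts and cap behavior are assumptions):

```python
def monthly_charge(pledge_per_video, videos_released, monthly_cap=None):
    """Hypothetical per-video pledge: the charge scales with the creator's output,
    clipped to an optional monthly maximum set by the patron."""
    total = pledge_per_video * videos_released
    if monthly_cap is not None:
        total = min(total, monthly_cap)
    return total

# A patron pledging $2 per video during a 7-video month, with a $10 monthly maximum:
print(monthly_charge(2.00, 7, monthly_cap=10.00))  # prints 10.0 (the cap), not the uncapped 14.0
```

The cap is the point: a prolific month can never charge a patron more than the ceiling they set, while the creator still sees a predictable amount per video.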
Conte is best known for being one half of the band Pomplamoose, which creates unusual covers of hit songs for our viewing pleasure. Since finding success with Pomplamoose, Conte has moved on to his more electronic solo work, which he now funds through Patreon. As with the Green brothers, his main purpose for creating the site is to lower reliance on fickle ad rates. "I don't think ads really properly value content, because they don't take into account whether a user cares about that content," he told AllThingsD. "It's a binary rubric; they're either watching or not."
The online fundraising space is crowded, but Patreon has already attracted some notable names, including Peter Hollens, Julia Nunes, and the other half of Pomplamoose, Nataly Dawn. The $2.1 million funding round proves that Conte is serious; there may be a pay-what-you-want online video platform rivalry brewing. | Technology
2017-09/1579/en_head.json.gz/26267 | Disney Infinity - Video Review by IGN (Xbox 360 version)
Disney Infinity was released in North America two days ago on almost every platform, including the Wii U. The game features "Play Sets" and the "Toy Box." The Play Sets have specific challenges and puzzles for players to tackle, with unlockable characters, vehicles, and gadgets as rewards. The "Toy Box" allows players to create their own environments which can then be shared with friends.
The Xbox 360 version currently has a Metacritic average of 75 based on reviews from 14 critics. As of this writing, there aren't any reviews of the Wii U version by professional critics, but there are a few customer reviews on Amazon which you might want to check out.
IGN has released their video review of the game, and they highly recommend it. The video features the Xbox 360 version, but the Wii U version should be pretty much the same. You can watch it after the break.
Disney Infinity | Technology
2017-09/1579/en_head.json.gz/26286 | > 3M invests in Smart Energy Instruments
3M invests in Smart Energy Instruments Tuesday, Aug 26, 2014 3M New Ventures – the corporate venture arm of 3M – announced an equity stake in Toronto-based Smart Energy Instruments (SEI), a move that will further accelerate SEI’s efforts in developing electronic chipsets with high-precision, real-time monitoring capabilities for smart grids, as well as give 3M a bigger presence in the energy sector.“We are honored that 3M is joining our team of investors”The investment from 3M New Ventures comes as several existing stakeholders also bolstered their equity in SEI, including Venturelink Funds, ArcTern Ventures and the Ontario Capital Growth Corp. Together, the financing round, led by 3M New Ventures, totaled $5 million (or $5.6 million Canadian dollars). Other details of the transaction were not disclosed.“We are honored that 3M is joining our team of investors,” said SEI CEO Jeff Dionne. “We’ve hit several key milestones in our company’s growth, and the utility market is now primed to take full advantage of our smart grid technologies. We see 3M as an important strategic partner to jointly develop and commercialize innovative solutions for utilities and electrical equipment manufacturers.”SEI’s technology platform is based on a core set of chips with unprecedented measurement precision and low-power consumption that can be integrated into today’s intelligent electronic devices. It gives equipment manufacturers and utilities a huge competitive edge in functions such as grid load monitoring, fault detection and isolation.“SEI’s unique technology will strengthen 3M’s leadership in the energy sector and enable the development of ubiquitous infrastructure sensing and monitoring solutions,” added Stefan Gabriel, president of 3M New Ventures.Aging grid infrastructures and the surge in renewable energy sources – particularly solar and wind – are driving the need for more precise monitoring devices. 3M’s Electronics and Energy Business Group is building new platforms for greater relevance to the energy sector, and SEI’s chips will be a key part of 3M’s portfolio for grid automation.“We are impressed with SEI’s forward-thinking approach to the utility marketplace,” said Robert Visser, vice president of research & development of 3M’s Electronics and Energy Business Group. “We now have the opportunity to support a new standard for multi-dimensional electrical measurements in the utility industry.”According to Navigant Research, cumulative worldwide electric utility spending on asset management and condition monitoring systems will total close to $50 billion from 2014 to 2023.3M New Ventures, the corporate venture arm of 3M, identifies and invests in innovative, high-growth companies working on technologies and business models with strategic relevance for 3M's businesses. With staff located in EMEA, the Americas and Asia Pacific, 3M New Ventures scouts globally for the most promising opportunities and manages a portfolio of over 23 investments to date. For more information, visit www.3M.com/newventures.For more information, please visit: Business Wire Other Renewable News | 科技 |
2017-09/1579/en_head.json.gz/26290 | CST Enhances RCS Simulation Capabilities
Rome, October 7, 2014, CST - Computer Simulation Technology AG (CST) is showcasing its new features for radar cross-section (RCS) simulation in the upcoming version of its flagship electromagnetic simulation tool, CST STUDIO SUITE® 2015, at European Microwave Week (EuMW) 2014, booth 109.
Aerospace and defense companies and government agencies worldwide use CST STUDIO SUITE on mission-critical projects. Its tightly-integrated solvers cover a broad range of frequencies and scales, allowing demanding and complex electromagnetic environments to be simulated during the development phase of new technologies.
RCS is an important consideration when designing new equipment, and is something that CST has long supported. The 2015 release builds on the previous RCS capabilities of CST STUDIO SUITE by adding the ability to produce RCS maps - plots of RCS phase or amplitude against frequency and scan angle - with the shooting and bouncing ray (SBR) asymptotic solver. RCS maps give engineers a more detailed view of the cross-section of a platform, and can be used to help identify potential scatterers.
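As background (our gloss, not text from the CST release), radar cross-section is conventionally defined from the ratio of scattered to incident field strength in the far field:

$$\sigma \;=\; \lim_{r \to \infty} 4\pi r^{2}\,\frac{\lvert \mathbf{E}_\mathrm{s} \rvert^{2}}{\lvert \mathbf{E}_\mathrm{i} \rvert^{2}}$$

where $\mathbf{E}_\mathrm{s}$ is the field scattered back toward the radar at range $r$ and $\mathbf{E}_\mathrm{i}$ is the field incident on the target. An RCS map simply samples the amplitude and phase of this quantity over a grid of frequencies and aspect angles, which is why strong, angle-persistent features in such a map point toward dominant scatterers on the platform.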
CST STUDIO SUITE 2015 also introduces support for angle and frequency-dependent radar absorbing material (RAM) coatings. The angular dependency and frequency dependency of the material are important considerations when minimizing the worst-case RCS of a platform, and including these properties in the model can make the RCS simulation more accurate.
More information about the RCS capabilities of CST STUDIO SUITE 2015 will be given in the presentation of new features at the CST booth. Registration for the EuMW exhibition is free of charge, and the CST talks are open to all exhibition visitors. For an overview of all booth presentations see https://www.cst.com/EuMW-flyer or visit the CST booth and pick up a leaflet.
"RCS is a significant application for our software ," commented Martin Timm, Marketing Director, CST. "The new features in CST STUDIO SUITE 2015 build on over a decade of experience of RCS simulation, and will help engineers be able to analyze the cross-section of platforms more accurately and more effectively."
CST STUDIO SUITE 2015 is due for release at the end of Q1 2015.
About CST
Founded in 1992, CST offers the market's widest range of 3D electromagnetic field simulation tools through a global network of sales and support staff and representatives. CST develops CST STUDIO SUITE, a package of high-performance software for the simulation of electromagnetic fields in all frequency bands, and also sells and supports complementary third-party products. Its success is based on a combination of leading edge technology, a user-friendly interface and knowledgeable support staff. CST's customers are market leaders in industries as diverse as telecommunications, defense, automotive, electronics and healthcare. Today, the company enjoys a leading position in the high-frequency 3D EM simulation market and employs 250 sales, development, and support personnel around the world.
CST STUDIO SUITE is the culmination of many years of research and development into the most accurate and efficient computational solutions for electromagnetic designs. From static to optical, and from the nanoscale to the electrically large, CST STUDIO SUITE includes tools for the design, simulation and optimization of a wide range of devices. Analysis is not limited to pure EM, but can also include thermal and mechanical effects and circuit simulation. CST STUDIO SUITE can offer considerable product to market advantages such as shorter development cycles, virtual prototyping before physical trials, and optimization instead of experimentation.
Further information about CST is available on the web at
https://www.cst.com.
Ruth Jackson
CST AG
Tel: +49 6151 7303-752
Email: info@cst.com
Web: https://www.cst.com/
| Technology
2017-09/1579/en_head.json.gz/26312 |
Bold buildings are sprouting in areas that have remained barren since World War II
By Dalia Fahmy
Bloomberg, Apr 30, 2015
On a dusty parking lot in central Berlin, where U.S. bombs leveled homes and offices in 1945, Dutch architect Rem Koolhaas is planning a beehive-like digital media center for publisher Axel Springer. A 15-minute bike ride away, on the former site of an iron foundry, architect Daniel Libeskind is completing a titanium-wrapped apartment building crowned with a soaring penthouse. Across the Humboldthafen canal, a new neighborhood with a tree-lined boulevard flanked by shops, homes, and offices is rising where rail-switching yards stood before they were wiped out during World War II.
(Image: A rendering of Axel Springer's digital center. Source: Courtesy Axel Springer/OMA)
Seventy years after the end of the war, Berlin is finally filling the last gaps left by Allied bombs, which destroyed more than two-thirds of the buildings in the city center. Architects say the construction boom offers Berlin a chance to make up for decades of bad planning and mediocre architecture. “This is a new time in Berlin,” says Libeskind, the Polish American architect who designed the Jewish Museum in Berlin and drew up the master plan for the new World Trade Center site in Manhattan. “It’s one of the great cities of the world, and we expect it to compete. We don’t expect it to be some backwater.” Read more…
Filed under Media Scan | Technology
2017-09/1579/en_head.json.gz/26363 | Can fuel suppliers refuse autogas sales for aviation?
October 27, 2011 by Kent Misegades

In the past three years, the authors of the GAFuels blog have assisted countless pilots, flight schools, flying clubs, FBOs, airport managers and others in the search for suppliers of ethanol-free autogas. If existing suppliers of the other FAA-approved aviation fuels (avgas and Jet-A) would offer autogas, our help would not be necessary. (Why they don't sell it is a mystery to us — after all, these same companies already produce enormous quantities of autogas, then adulterate it with ethanol before selling it for highway use.)
This forces airports to deal with their local fuel suppliers, often small “jobbers” as they are called, who haul a variety of fuel from area terminals to those who need it. Often, this includes deliveries of ethanol-free fuel to gas stations (best found via Pure-Gas.org), marinas, race tracks, farms, and to various others who prefer ethanol-free gasoline. One such company, Renner Petroleum, hauls the fuel from a terminal in Reno, Nev., all the way to Fortuna, Calif., to be used in commercial fishing boats whose engines do not tolerate ethanol blends. That’s a distance of 334 miles, each way!
On occasion, we hear from someone who has contacted his local supplier of ethanol-free fuel to arrange deliveries to his airport, only to be told that the company will not sell its products for use in aircraft. When probed for a reason, the typical answer has to do with the company’s legal department knee-jerking over their perception of unusual liability related to aviation. Such an attitude is not only ignorant, but ironic, given that fuel suppliers appear to have no apprehension to selling ethanol blends that are known to cause serious damage and safety risks to hundreds of millions of engines. Is there no liability for the risk to the sport fisherman whose engine fails due to ethanol 50 miles offshore with an approaching storm?
Ignorance aside, what about the legality of a fuel producer or supplier agreeing to sell his product to one market but not to another? Autogas has been an FAA-approved aviation fuel since the first STCs were issued in 1982. Since then it has been in daily use in tens of thousands of legacy aircraft with autogas STCs. It is the most appropriate fuel for auto engine conversions widely used in experimental-category aircraft, and it is an approved fuel for nearly all of the latest generation of piston aircraft engines from Rotax, ULPower, Jabiru, Continental, Lycoming and others. As long as the fuel meets the STC or TC requirements, pilots can and should use it when appropriate, and sellers have no solid technical or certificate-related arguments to refuse sales for aviation.
We recently wrote a lengthy article that describes how ethanol-free fuel sold at many gasoline stations complies with all the requirements needed for aviation. Nevertheless, some fuel producers have released statements that imply that their products are not suitable for aviation. Take, for instance, a flyer that the Marathon Oil Company has distributed in recent years. Marathon markets its popular “Recreational Gasoline” to those needing an ethanol-free alternative. In the flyer, the company claims that its gasolines “are manufactured to meet high automotive standards, including those of ASTM D4814.” The flyer then goes on to state “We are aware the Federal Aviation Administration has issued Supplemental Type Certificates approving the use of unleaded automotive gasoline with a minimum (R+M)/2 octane of 87.0 and meets automotive gasoline standard ASTM D4814 in certain small aircraft. Although Marathon’s gasolines may meet these specifications, Marathon does not recommend use of its fuel for aircraft of any type, including aircraft with a Supplemental Type Certificate for use of automotive gasoline.”
Why? That question remained unanswered in the company’s statement. Its authors need to update their information — autogas can power 70%-80% of the entire piston-engine fleet, nearly all new LSAs, and a wide range of warbirds and old cargo aircraft, not, as Marathon’s flyer suggests “…certain small aircraft.”
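A brief note on the octane figure quoted from the flyer (this gloss is ours, not Marathon's or the FAA's): the "(R+M)/2" rating posted on U.S. pumps is the anti-knock index, the simple average of the Research and Motor octane numbers:

$$\mathrm{AKI} = \frac{\mathrm{RON} + \mathrm{MON}}{2}$$

A pump placard of 87 therefore satisfies the minimum 87.0 (R+M)/2 called out in the autogas STCs, provided the fuel also conforms to ASTM D4814 and, as discussed throughout this article, contains no ethanol.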
Marathon and its competitors must manufacture and distribute their products as complying with specific ASTM designations and provide disclosure in order to sell it. Proof of compliance may be found in the Bill of Ladings provided to the fuel jobber when a load is picked up at a terminal. If a product does not comply, the supplier is mislabeling their product and can be fined or sued by consumer protection agencies, not to mention customers who relied on their labeling, regardless of warnings. Since aircraft STCs and TCs approve the labeled products, there is no special liability for the seller simply because the fuel is sold for aviation.
What about laws that prevent a producer from prejudicial sales, i.e., selling the same product to one individual but denying it to another, although both have a legitimate reason for purchase? It is clear that if a company offers a product for sale across state lines, as most gasoline producers do, then they are subject to federal and state trade laws.
USLegal.com describes restraint of trade as follows: "Restraint of trade means any activity which tends to limit trade, sales and transportation in interstate commerce or has a substantial impact on interstate commerce. Antitrust law prohibits most of these types of practices. The main antitrust law is the Sherman Act. To prevent trusts from creating restraints on trade or commerce and reducing competition, Congress passed the Sherman Antitrust Act in 1890. The Sherman Act aims to eliminate restraints on trade and competition. States also have laws against restraints of trade that have strictly local impact." When a producer refuses to sell his product to one group, but agrees to sell it to others, this reduces competition as the party excluded from purchase has fewer options to buy the same product elsewhere.
Well, there you have it. Not only are fuel suppliers who refuse to sell autogas for aviation purposes ignorant of it being an FAA-approved aviation fuel for nearly three decades, they are likely in violation of federal and state laws. If you are on the receiving end of such a refusal, cite the Sherman Antitrust Act and ask what gives this company the right to ignore it.
On a related matter, some airports have attempted in the past to prevent pilots from fueling their own aircraft, citing restrictions in their own “Master Plans.” The lack of self-service fueling generally leads to higher costs for FBOs, limitations on when fuel is available, and almost always to higher costs of fuel for pilots. The FAA, however, includes fueling as one of the preventative maintenance activities we’re allowed to perform on our aircraft, according to Title 14 of the Code of Federal Regulations. Once again, when confronted with what seems to be an unusual restriction on the use of your airplane, ask to see the federal or state law that makes this so. As they say, “forewarned is forearmed.”
The GAfuels Blog is written by two private pilots concerned about the future availability of fuels for piston-engine aircraft: Dean Billing, Sisters, Ore., an expert on autogas and ethanol, and Kent Misegades, Cary, N.C., an aerospace engineer, aviation sales rep for U-Fuel, and president of EAA1114.
Kent Misegades
Kent Misegades, Cary, N.C., a pilot since age 15, aerospace engineer and homebuilder, started his career fueling aircraft at a large FBO in Louisville, Ky.
F1boss says November 29, 2011 at 1:57 pm Kent:
If you have a list of suppliers, that sure would be a handy reference to have on hand here (central Texas). I appreciate any help you can lend!
Kent Misegades says October 29, 2011 at 4:18 pm Danrh, the answer may be seen in recent developments: Tecnam, the world’s #1 light plane producer, has an 100% autogas fleet. Lycoming and Continental announced new autogas-burning engines this year. The AIr Plains / Petersen ADI water/methanol system has been reintroduced, allowing high compression engines to run on autogas. 94UL is now becoming available across Europe. Avgas consumption dropped by another 5% in the past year. Who would want to produce a product for a market that is in decline?
Danrh says October 29, 2011 at 3:31 am I just want to know why is it taking so long to bring Swift fuel to the market place?
Ethanol in gasoline is a joke and must be stopped. I get irate every time I see the sign on the pump "ENHANCED" with Ethanol. Getting screwed and lied to at the same time. | Technology
2017-09/1579/en_head.json.gz/26419 | Gartner: Enterprises Are Getting More 'SaaS-y'
By Jim Barthold | 12/04/2008
Enterprises are starting to appreciate software as a service (SaaS) and plan to maintain or grow their SaaS use, according to a survey conducted by Gartner, tapping 258 respondents. "Nearly 90 percent" of organizations "expect to maintain or grow" their SaaS use, Gartner found. The survey polled individuals responsible for implementing enterprise software in organizations. It was conducted across eight countries in June and July. The results indicated that SaaS is increasingly seen as a way to offload on-premises software systems to save money and drive change within corporate environments. While that sounds positive, SaaS is "not good for all things," said study author Sharon Mertz, a research director at Gartner. SaaS is not the appropriate choice for organizations with security policies prohibiting off-site access, she explained. "It's also not appropriate for all applications," Mertz said. "If you have a very complicated business process that is tightly integrated with on-premises systems or other back-end applications, it may not be sensible to try to extract a piece of that and use it as software as a service." Increasingly, though, SaaS, unlike its ASP predecessor, seems to be arriving at the right place at the right time. Greater access to broadband connections worldwide is helping the case for SaaS delivery. Also, SaaS vendors have been carving out niches for themselves by delivering customized software to meet specific corporate needs.
"It's becoming a more accepted way to handle your computing requirements," Mertz said. "There is less concern than there was in the past with uptime [and] less concern with security." There's also less resistance to SaaS from IT departments, she added, since "we found in the survey that a lot of the decisions on SaaS were joint decisions between the business and IT."
Size matters when it comes to SaaS. Small-to-medium businesses lacking IT resources may consider using such services. It's also gaining some traction with larger enterprises that traditionally have run computing resources at their own premises.
For SaaS adopters, a big decision is which platform to use. Providers include Microsoft, Oracle, SAP, Force.com and others, Mertz said. Microsoft sells its hosted solutions directly while also relying on its partner network to deliver customized solutions.
Some critical computing applications probably will never leave the corporate confines. For those considering SaaS, the key is accepting that not every software application will require on-premises hands-on attention, Mertz said.
"You don't want your IT guys to be doing things that can be more cost effectively handled by a service like SaaS," she concluded.
Among the North American survey respondents, 62 percent expected to see "slight increases" in new SaaS investments and 15 percent expected to see "significant increases." Those numbers were 49 percent and 15 percent in Europe, and 55 percent and five percent in the Asia-Pacific region, respectively. The survey also found that 37 percent of the respondents were transitioning from an on-premises solution to SaaS. SaaS growth will depend mostly on future improvements in SaaS technology platforms. "It's not just something like expense reporting, which has been available as a service for years," Mertz said. "It's something that's more germane to the business process and they're going to need to have it integrated with their backend systems and they're probably going to want more functionality. That is one of the things that the survey validated." About the Author
Jim Barthold is a freelance writer based in Delanco, N.J., covering a variety of technology subjects. | Technology
2017-09/1579/en_head.json.gz/26570 | ShoreTel President and CEO Announces Retirement
SUNNYVALE, Calif., May 10, 2013 – ShoreTel® (NASDAQ: SHOR), the leading provider of brilliantly simple unified communications platforms including business phone systems, applications and mobile UC solutions, today announced that Peter Blackmore, president and CEO, has informed the company of his intention to retire as soon as a successor is announced.
“It has been a privilege to work with the ShoreTel team,” said Mr. Blackmore. "I am confident that the company is well-positioned in both the cloud and premise unified communications markets. With recent changes to our organizational and cost structure, we can now scale with a model positioned for profitability, enhancing our objective of delivering improved shareholder value.”
ShoreTel board member Chuck Kissner, who was recently appointed Chairman said, “Under Peter’s strong leadership, ShoreTel has nearly doubled its revenues, taken a leadership position in cloud-based business communications and consistently grown its premise-based IP business phone system share of the market.”
“We are grateful that Peter will continue in his role during this transition period. We intend to keep moving full speed ahead with our near-term plans to launch new innovative products and services and to continue to execute on our combined premise and cloud strategy,” Kissner added.
A search for a new CEO is underway.
Legal Notice Regarding Forward-Looking Statements
ShoreTel assumes no obligation to update the forward-looking statements included in this release. This release contains forward-looking statements within the meaning of the "safe harbor" provisions of the federal securities laws, including, without limitation, future outlook. The forward-looking statements are subject to risks and uncertainties that could cause actual results to differ materially from those projected. The risks and uncertainties include the intense competition in our industry, our reliance on third parties to sell and support our products, our ability to grow our ShoreTel Sky business, our ability to maintain our premise business in a profitable manner, supply and manufacturing risks, our ability to control costs as we expand our business, increased risk of intellectual property litigation by entering into new markets, our ability to attract, retain and ramp new sales personnel, uncertainties inherent in the product development cycle, uncertainty as to market acceptance of new products and services, the potential for litigation in our industry, risks related to our acquisition of M5 Networks, including technology and product integration risks, our ability to attract and retain key personnel and customers and the risk of assuming unknown liabilities, and other risk factors set forth in ShoreTel's Form 10-K for the year ended June 30, 2012, and in its Form 10-Q for the quarter ended December 31, 2012.
About ShoreTel, Inc.
ShoreTel, Inc., (NASDAQ: SHOR) is a leading provider of Pure IP unified communications solutions. ShoreTel enables companies of any size to seamlessly integrate all communications - voice, video, messaging and data - with their business processes. Independent of device or location, ShoreTel's distributed software architecture eliminates the traditional costs, complexity and reliability issues typically associated with other solutions. ShoreTel continues to deliver the highest levels of customer satisfaction, ease of use and manageability while driving down the overall total cost of ownership. ShoreTel is headquartered in Sunnyvale, California, and has regional offices in Austin, Texas, the United Kingdom, Sydney, Australia and Munich, Germany. For more information, visit www.shoretel.com or call 1-800-425-9385. | Technology
2017-09/1579/en_head.json.gz/26587 | Combating climate change by storing CO2 underground
PRI's The World, September 30, 2009 · 4:25 PM CDT
Ashley Ahearn reports for PRI's "The World."
In the past year Iceland has been in the news for all the wrong reasons. First the banks collapsed, then the economy, and then the government. But here’s something that survived -- a research project aimed at removing excess carbon dioxide from the atmosphere and storing it beneath the earth’s surface. CO2 emissions contribute to climate change and rising sea levels and many countries, including the US, are investing millions to develop so-called CO2 sequestration technology. The project in Iceland is especially promising.
Driving southwest from Reykjavik, Iceland’s capital, the land looks like crumbled Oreo cookies – miles and miles of them strewn beneath a steel-gray sky. The volcanoes that formed Iceland spewed out hot magma which cooled into a porous black rock known as basalt. Iceland is 90% basalt and scientists here think all that rock might be as good as gold in the fight against global warming. The key – like so much else on this volcanic hot spot – may lie underground.
"Actually they’re drilling and tapping off energy from the volcano and the heat has been here for thousands of years and we can expect the heat to be here for other thousands of years," says Almar Sigurosson. He works forks for Reykjavik Energy,the company that taps the steam and boiling hot water that surged through the rocks beneath this volcano to produce enough electricity for two thirds of Iceland’s population. He continues, "This is bore hole that goes down three kilometers. And from these bore holes, there comes mixture of steam and water and the steam then goes to the power plant and turns the turbines."
This is the kind of clean geothermal energy that environmentalists love. But along with the steam this plant also releases a small amount of naturally-created carbon dioxide, a key greenhouse gas. That led company officials to start thinking about how they might be able to capture and store that CO2 to keep it from adding to the global warming problem. The answer, they realized, might lie in the very rock that harbors the steam and hot water – Iceland’s ubiquitous basalt.
"When these types of rocks are exposed to air then they react with the air and with rainwater and the process is called weathering. And the minerals – they get decomposed because of that weathering process," says Juerg Matter. He’s a geologist at Columbia University’s Lamont-Doherty Earth Observatory. The weathering process he’s describing is important because basalt doesn’t just decompose. The minerals in the basalt bond with CO2 to form new carbonate rocks transforming the CO2 from a gas to a solid in the process. The reaction is exciting for scientists and CO2 emitters alike because it keeps CO2 out of the air virtually forever. And there’s more good news. Basalt is the most common rock on earth. Juerg Matter says there’s potential to capture a significant amount of CO2 by pumping it from power plants into underground deposits of basalt around the world.
"We could basically sequester – the estimates are – a billion tons of CO2," he says.
The project backers say the potential could be much greater. The reaction between basalt and CO2 happens extremely slowly in nature. But Matter hopes the deep injection process will help speed it up. That’s where the geothermal plant in Iceland comes in. It’s the first real-world test site of this sequestration concept. Holmfriour Siguroardottir works in Reykjavik Energy’s Innovations Department which has partnered with Juerg Matter and other scientists on the project.
"Right now we are approaching the injection wells where we have been doing all this preparation work. You can’t so much on the surface. Everything is happening below the surface."
A couple of white tubes stick out of a patch of brown earth – only a hint of the giant science experiment that’s about to begin beneath our feet. Starting this weekend Siguroardottir says her company will begin pumping a mixture of ground water and CO2 from the geothermal plant 600 meters down the white tubes into the porous basalt below.
"That will seep into the basaltic rock where it will react with the minerals in the rock and we are aiming at forming carbonates – carbonate minerals – where it will fixed. It will be there as a mineral not as a gas," says Siguroardottir.
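The story doesn't spell out the chemistry, but representative mineral-carbonation reactions for the calcium- and magnesium-bearing silicates in basalt run along these lines (textbook illustrations, not necessarily the exact pathway at the Icelandic site):

$$\mathrm{CaSiO_3 + CO_2 \longrightarrow CaCO_3 + SiO_2}$$
$$\mathrm{Mg_2SiO_4 + 2\,CO_2 \longrightarrow 2\,MgCO_3 + SiO_2}$$

CO2 dissolved in groundwater forms carbonic acid, which leaches calcium, magnesium and iron out of the rock; those ions then precipitate with the dissolved carbon as stable carbonate minerals, the carbonates described above.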
That is important because the big criticism of other carbon sequestration schemes is that the carbon would be stored as a gas which could escape back into the atmosphere. About 200 meters away from the injection tubes two other bore holes will allow researchers to take samples of groundwater downstream. They’ll analyze the water to find out how much CO2 has bonded with the rock; how much new rock is being produced; and how fast the reaction is taking place.
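One simple way such downstream samples could be turned into a mineralization estimate (a back-of-the-envelope sketch under our own assumptions; the article does not describe the project's actual protocol) is to compare the dissolved inorganic carbon actually measured with the amount a co-injected, non-reactive tracer says should still be in solution:

```python
def fraction_mineralized(dic_expected, dic_measured):
    """Hypothetical mass balance: the share of injected carbon no longer in solution.

    dic_expected -- dissolved inorganic carbon predicted at the monitoring well
                    from dilution of a co-injected conservative tracer
    dic_measured -- dissolved inorganic carbon actually measured in the sample
    (both in the same units, e.g. millimoles per litre)
    """
    return 1.0 - dic_measured / dic_expected

# If tracer dilution predicts 2.0 mmol/L of carbon but only 1.4 mmol/L is measured,
# roughly 30 percent of the injected CO2 has been taken up by the rock:
print(fraction_mineralized(2.0, 1.4))  # ~0.3
```

Carbon that has gone missing from the water, relative to the conservative tracer, is carbon that has reacted with the basalt.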
"So that’s a major step in the right direction – to be able to actually monitor what’s going on in time," according to Gordon Brown, a geology professor and Stanford University. He’s been studying the problem of CO2 sequestration for 40 years. That’s longer than most scientists have even recognized that rising levels of atmospheric CO2 are in fact a problem. But Brown takes heart in the possibility that the Iceland research could lead to a breakthrough.
"It’s an old problem and it’s exciting to me that finally we’re starting to actually do things that might lead to some solution. Ultimately we need to fix this problem and I’m beginning to see the hope that it might actually be fixed within my lifetime."
This CO2 solution isn’t yet set in stone. Knowing how the basalt reaction works in the real world is only a small part of the puzzle. Among the big questions are how to minimize the need for large amounts of water as well as the possible seismic and ecological impacts of altering the rocks that make up the very ground beneath us. But many in the field think this kind of research has a lot of promise. Anyone who figures out a safe and reliable way of locking up CO2 emissions could be in a position to make a lot of money as countries start to crack down on greenhouse gas pollution. Siguroardottir says that’s also part of the company’s thinking.
"Of course in the back of our head we look at it as a business opportunity. But let’s see how it works out."
The first tests of Iceland’s experiment are scheduled to begin this weekend.
PRI's "The World" is a one-hour, weekday radio news magazine offering a mix of news, features, interviews, and music from around the globe. "The World" is a co-production of the BBC World Service, PRI and WGBH Boston.
More "The World."
PRI's coverage of social entrepreneurship is supported by the Skoll Foundation.
In Business, Finance & Economics. Tagged: United States, Iceland, Ashley Ahearn, Almar Sigurosson, Juerg Matter, Gordon Brown, Holmfriour Siguroardottir, social entrepreneurship | Technology
2017-09/1579/en_head.json.gz/26619 | Who Coined 'Cloud Computing'?
Now that every technology company in America seems to be selling cloud computing, we decided to find out where it all began.
Cloud computing is one of the hottest buzzwords in technology. It appears 48 million times on the Internet. But amidst all the chatter, there is one question about cloud computing that has never been answered: Who said it first?
Proof of concept: George Favaloro poses with a 1996 Compaq business plan. The document is the earliest known use of the term "cloud computing" in print.
Some accounts trace the birth of the term to 2006, when large companies such as Google and Amazon began using "cloud computing" to describe the new paradigm in which people are increasingly accessing software, computer power, and files over the Web instead of on their desktops.

But Technology Review tracked the coinage of the term back a decade earlier, to late 1996, and to an office park outside Houston. At the time, Netscape's Web browser was the technology to be excited about and the Yankees were playing Atlanta in the World Series. Inside the offices of Compaq Computer, a small group of technology executives was plotting the future of the Internet business and calling it "cloud computing."

Their vision was detailed and prescient. Not only would all business software move to the Web, but what they termed "cloud computing-enabled applications" like consumer file storage would become common. For two men in the room, a Compaq marketing executive named George Favaloro and a young technologist named Sean O'Sullivan, cloud computing would have dramatically different outcomes. For Compaq, it was the start of a $2-billion-a-year business selling servers to Internet providers. For O'Sullivan's startup venture, it was a step toward disenchantment and insolvency.

See the rest of our Business Impact report on Business in the Cloud.

Cloud computing still doesn't appear in the Oxford English Dictionary. But its use is spreading rapidly because it captures a historic shift in the IT industry as more computer memory, processing power, and apps are hosted in remote data centers, or the "cloud." With billions of dollars of IT spending in play, the term itself has become a disputed prize. In 2008, Dell drew outrage from programmers after attempting to win a trademark on "cloud computing." Other technology vendors, such as IBM and Oracle, have been accused of "cloud washing," or misusing the phrase to describe older product lines.

Like "Web 2.0," cloud computing has become a ubiquitous piece of jargon that many tech executives find annoying, but also hard to avoid. "I hated it, but I finally gave in," says Carl Bass, president and CEO of Autodesk, whose company unveiled a cloud-computing marketing campaign in September. "I didn't think the term helped explain anything to people who didn't already know what it is."

The U.S. government has also had trouble with the term. After the country's former IT czar, Vivek Kundra, pushed agencies to move to cheaper cloud services, procurement officials faced the question of what, exactly, counted as cloud computing. The government asked the National Institute of Standards and Technology to come up with a definition. Its final draft, released this month, begins by cautioning that "cloud computing can and does mean different things to different people."

"The cloud is a metaphor for the Internet. It's a rebranding of the Internet," says Reuven Cohen, cofounder of Cloud Camp, a course for programmers. "That is why there is a raging debate. By virtue of being a metaphor, it's open to different interpretations." And, he adds, "it's worth money."

Part of the debate is who should get credit for inventing the idea. The notion of network-based computing dates to the 1960s, but many believe the first use of "cloud computing" in its modern context occurred on August 9, 2006, when then Google CEO Eric Schmidt introduced the term to an industry conference.

"What's interesting [now] is that there is an emergent new model," Schmidt said. "I don't think people have really understood how big this opportunity really is. It starts with the premise that the data services and architecture should be on servers. We call it cloud computing—they should be in a 'cloud' somewhere."

The term began to see wider use the following year, after companies including Amazon, Microsoft, and IBM started to tout cloud-computing efforts as well. That was also when it first appeared in newspaper articles, such as a New York Times report from November 15, 2007, that carried the headline "I.B.M. to Push 'Cloud Computing,' Using Data From Afar." It described vague plans for "Internet-based supercomputing."

Sam Johnston, director of cloud and IT services at Equinix, says cloud computing took hold among techies because it described something important. "We now had a common handle for a number of trends that we had been observing, such as the consumerization and commoditization of IT," he wrote in an e-mail.

Johnston says it's never been clear who coined the term. As an editor of the Wikipedia entry for cloud computing, Johnston keeps a close eye on any attempts at misappropriation. He was first to raise alarms about Dell's trademark application and this summer he removed a citation from Wikipedia saying a professor at Emory had coined the phrase in the late 1990s. There have been "many attempts to coopt the term, as well as various claims of invention," says Johnston.

That may explain why cloud watchers have generally disregarded or never learned of one unusually early usage—a May 1997 trademark application for "cloud computing" from a now-defunct company called NetCentric. The trademark application was for "educational services" such as "classes and seminars" and was never approved. But the use of the phrase was not coincidental. When Technology Review tracked down NetCentric's founder, O'Sullivan, he agreed to help dig up paper copies of 15-year-old business plans from NetCentric and Compaq. The documents, written in late 1996, not only extensively use the phrase "cloud computing," but also describe in accurate terms many of the ideas sweeping the Internet today.
Cloud 1.0: Entrepreneur Sean O’Sullivan filed a trademark on “cloud computing” in 1997. He poses at the offices of NetCentric, in Cambridge, Massachusetts during the late 1990s.
At the time, O'Sullivan's startup was negotiating a $5 million investment from Compaq, where Favaloro had recently been chosen to lead a new Internet services group. The group was a kind of internal "insurgency," recalls Favaloro, that aimed to get Compaq into the business of selling servers to Internet service providers, or ISPs, like AOL. NetCentric was a young company developing software that could help make that happen.

In their plans, the duo predicted technology trends that would take more than a decade to unfold. Copies of NetCentric's business plan contain an imaginary bill for "the total e-purchases" of one "George Favaloro," including $18.50 for 37 minutes of video conferencing and $4.95 for 253 megabytes of Internet storage (as well as $3.95 to view a Mike Tyson fight). Today, file storage and video are among the most used cloud-based applications, according to consultancy CDW. Back then, such services didn't exist. NetCentric's software platform was meant to allow ISPs to implement and bill for dozens, and ultimately thousands, of "cloud computing-enabled applications," according to the plan.

Exactly which of the men—Favaloro or O'Sullivan—came up with the term cloud computing remains uncertain. Neither recalls precisely when the phrase was conceived. Hard drives that would hold e-mails and other electronic clues from those precloud days are long gone.

Favaloro believes he coined the term. From a storage unit, he dug out a paper copy of a 50-page internal Compaq analysis titled "Internet Solutions Division Strategy for Cloud Computing" dated November 14, 1996. The document accurately predicts that enterprise software would give way to Web-enabled services, and that in the future, "application software is no longer a feature of the hardware—but of the Internet."

O'Sullivan thinks it could have been his idea—after all, why else would he later try to trademark it? He was also a constant presence at Compaq's Texas headquarters at the time. O'Sullivan located a daily planner, dated October 29, 1996, in which he had jotted down the phrase "Cloud Computing: The Cloud has no Borders" following a meeting with Favaloro that day. That handwritten note and the Compaq business plan, separated by two weeks, are the earliest documented references to the phrase "cloud computing" that Technology Review was able to locate.

"There are only two people who could have come up with the term: me, at NetCentric, or George Favaloro, at Compaq … or both of us together, brainstorming," says O'Sullivan.

Both agree that "cloud computing" was born as a marketing term. At the time, telecom networks were already referred to as the cloud; in engineering drawings, a cloud represented the network. What they were hunting for was a slogan to link the fast-developing Internet opportunity to businesses Compaq knew about. "Computing was bedrock for Compaq, but now this messy cloud was happening," says Favaloro. "And we needed a handle to bring those things together."

Their new marketing term didn't catch fire, however—and it's possible others independently coined the term at a later date.
Consider the draft version of a January 1997 Compaq press release, announcing its investment in NetCentric, which described the deal as part of "a strategic initiative to provide 'Cloud Computing' to businesses." The phrase would have been ages ahead of its time, had Compaq's internal PR team not objected and changed it to "Internet computing" in the final version of the release.

In fact, Compaq eventually dropped the term entirely, along with its plans for Internet software. That didn't matter to Favaloro. He'd managed to point Compaq (which later merged with HP) toward what became a huge business selling servers to early Internet providers and Web-page hosters, like UUNet. "It's ridiculous now, but the big realization we had was that there was going to be an explosion of people using servers not on their premises," says Favaloro. "I went from being a heretic inside Compaq to being treated like a prophet."

For NetCentric, the cloud-computing concept ended in disappointment. O'Sullivan gave up using the term as he struggled to market an Internet fax service—one app the spotty network "cloud" of the day could handle. Eventually, the company went belly up and closed its doors. "We got drawn down a rathole, and we didn't end up launching a raft of cloud computing apps … that's something that sticks with me," says O'Sullivan, who later took a sabbatical from the tech world to attend film school and start a nonprofit to help with the reconstruction of Iraq.

Favaloro now heads an environmental consulting firm in Waltham, Massachusetts. What is remarkable, he says, is that the cloud he and O'Sullivan imagined 15 years ago has become a reality. "I now run a 15-person company and, in terms of making us productive, our systems are far better than those of any big company. We bring up and roll out new apps in a matter of hours. If we like them, we keep them; if not, we abandon them. We self-administer, everything meshes, we have access everywhere, it's safe, it's got great uptime, it's all backed up, and our costs are tiny," says Favaloro. "The vision came true."
Credit: Antonio Regalado; NetCentric
Antonio Regalado
Senior Editor, Biomedicine
I am the senior editor for biomedicine for MIT Technology Review. I look for stories about how technology is changing medicine and biomedical research. Before joining MIT Technology Review in July 2011, I lived in São Paulo, Brazil, where I wrote about science, technology, and politics in Latin America for Science and other publications. From 2000 to 2009, I was the science reporter at the Wall Street Journal and later a foreign correspondent.
Business in the Cloud
Treating computing as a utility, like electricity, is an old idea. But now it makes financial sense—a historic shift that explains why cloud computing is reshaping the economics of IT. Even startup companies and consumers now can access massive amounts of computing power. The cloud is also raising new questions about privacy and security.
The Cloud Imperative
Treating computing as a utility, like electricity, is an old idea. But now it makes financial sense—a historic shift that is reshaping the IT industry.
Cloud Computing Defined
A primer on key terms in Business Impact this month.
Being Smart about Cloud Security
An authority on Web security believes your data might be safer in the cloud.
by Brian Bergstein
The Battle for the Government
As governments all over the world move their IT to the cloud, they are becoming some of Google and Microsoft’s most coveted customers.
by Lee Gomes
Facebook Shares Its Cloud Designs
Cloud hardware could get cheaper because of the social network's self-interested altruism.
Forget the Quantified Self. We Need to Build the Quantified Us
Author: Matthew Jordan and Nikki Pfarr.
Design | Date of Publication: 04.04.14
Modwells, a concept designed by Artefact, where the two authors work. The idea is to create sensors that can be embedded in clothing which gather data that can be viewed by both patient and doctor. Image: Artefact

The 'Quantified Self' is a thrilling prospect for some: Massive datasets about oneself can be a new route to self-discovery. But for most of us, the idea of continuous self-tracking is a novelty that results in shallow insights. Just ask anyone who has bought a Fitbit or Jawbone Up which now lies dusty at the bottom of a junk drawer.

For the Quantified Self movement to become truly useful, our gadgets will have to move beyond the novelty of gratuitous behavioral data, which we might call a 'first degree of meaning.' They'll have to address a second degree of meaning, where self-tracking helps motivate people toward self-improvement, and a third degree of meaning, where people can use data to make better choices in the moments when a decision is actually being made.

We're moving closer to those goals, but we're still not thinking rigorously about the challenges involved. So let's start.
About Matthew Jordan
Matthew Jordan, Artefact's research director, has worked with companies like Baxter Healthcare, St. Jude Medical, and Mayo Clinic to apply the design process to the health industry. Two recent Artefact projects he led are the Juice Box energy system and Dialog, a concept for people with epilepsy. He can be reached at health@artefactgroup.com.
It so happens that the rise of the quantified self coincides with the rise of Big Data, which has become a buzzword rapidly adopted in targeted marketing campaigns and recommendation engines that push products. But in between Big Data and Small Data, between the Quantified Self and the crowd lies a third way: what we at Artefact like to call the Quantified Us.
Imagine a future where self-tracking harnesses a whole population’s data to identify patterns and make meaningful recommendations. Imagine a future where we can see into the data of people just like us, to help us live better, and where we willingly give up a bit of privacy in exchange for vast benefits.
The Quantified Us
The Quantified Us should be based on a select group of people who share similar goals, health conditions, or even similarity of emerging data patterns. They could be your friends, but they're more likely strangers who happen to have a lot in common with you. We are already seeing the beginnings of a Quantified Us movement, though we feel its full potential is untapped:

- PatientsLikeMe allows people to share personal health records so they can compare 'treatments, symptoms, and experiences.' The site also supports personal connections with the community, as well as the ability to track your own health data and to make your records available to medical researchers. These data, however, are positioned as a tool for the medical community to review and gain clinical insights.

- Crohnology is a social network centered on people who suffer from Crohn's disease and colitis. The community revolves around the sharing and aggregation of information. But the scope and depth of data that the patient can access is limited, and, as a result, so are the insights.

- StockTwits uses a followers model, connecting investors who are interested in the same financial opportunities. Though the insights can be very timely and represent the sentiment of an informed group, the 'group' is just defined by who decides to follow whom. There is no collaboration, because no one is sharing their personal data.
Dialog, another concept from Artefact. This one is for people with epilepsy: It would warn of oncoming seizures and track environmental triggers, while also serving patient data to doctors. Image: Artefact
While these early, partial examples of the Quantified Us are headed in the right direction, they still make users manually share their data and pan for insights. The Quantified Us should instead tackle the challenge of helping these groups form, facilitating data collection, extracting insights tailored to individual action. To get to this future, we must be clear about what the Quantified Us is and how it achieves success. A successful Quantified Us strategy is:
- Selective, But Configurable: People must be able to control the boundaries of how their data is shared, and the sample sets to which they're compared. For example, if a woman experiences migraines brought on by caffeine, she—not anyone else—should be able to exclude people with unrelated triggers. For that to work, designers have to create user experiences that make clear the boundaries between groups and the individual user.

- Driven by Democracy: The Quantified Us hinges upon people making a choice to trade their personal data for access to broader swathes of information. That's a grass-roots type movement, and design's role should be to foster a sense of community and transparency.

- Focused on Individual Understanding and Decisions: The Quantified Us means nothing unless individuals can extract insights, make better decisions, and change their behaviors. Therefore, a well-designed platform should make decision making the core of its user experience.

What patients would see and could share with doctors with the Dialog concept. Image: Artefact
Imagine a person with epilepsy trying to understand an uptick in seizures. What if he could compare his triggers to those of people just like him? Such a user experience could address everything from Crohn’s disease to migraines. These need not be separate products: Indeed, they could be similar user experiences, tailored to individual use cases. Now imagine a person with insulin-dependent diabetes whose blood sugars are running high at night, but who isn’t able or doesn’t feel motivated to understand why. What if she could see the profiles and data of other people like her, and see where she falls relative to the “norm”? What if she was able to start a dialog with other people like her, or to get emotional support when she needs it?
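As a rough illustration of what comparing yourself "relative to the norm" could look like in software, here is a minimal Python sketch. The cohort numbers, the user's reading, and the helper function are all hypothetical, invented purely for illustration; they are not drawn from any Artefact product or real patient data.

```python
# Illustrative sketch only: the cohort values and the user's reading are invented.

def percentile_rank(value, cohort):
    """Return the percentage of cohort readings at or below `value`."""
    at_or_below = sum(1 for reading in cohort if reading <= value)
    return 100.0 * at_or_below / len(cohort)

# Hypothetical average overnight glucose readings (mg/dL) for "people like her".
cohort_overnight_glucose = [112, 118, 121, 125, 129, 133, 136, 141, 148, 155]

user_reading = 150
rank = percentile_rank(user_reading, cohort_overnight_glucose)
print(f"Your overnight average of {user_reading} mg/dL is at or above "
      f"{rank:.0f}% of readings from people with a similar profile.")
```

The point is simply that a single number becomes far more meaningful, and more motivating, once it is placed against a cohort of genuinely comparable people.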
It's easy to imagine a variety of scenarios in which self-tracking combined with collective data sharing can result in deeper understanding and heightened motivation. Ultimately the Quantified Us can help people take better care of themselves, more often—and feel more connected to each other in the process.
December SOPA Update: GoDaddy.com
Posted by: tushar nene December 27, 2011 in Uncategorized
Earlier this month we took a look at the Stop Online Piracy Act (SOPA) as it made its way through hearings in the House Judiciary Committee, through amendments, strong objections and ultimately a question on whether or not those folks in the room were even qualified to make any rational and informed decision on the topic. Eventually the proceedings were postponed and will pick up again when the House reconvenes after the holidays, but that doesn't mean that December has to be devoid of all SOPA news, does it? Politics aside, there was still a fair amount of SOPA news in the last two weeks or so, the majority of it revolving around one of SOPA's public supporters, domain name registrar GoDaddy.com.
While many other internet companies lined up to publicly oppose SOPA as a death sentence to the free web, GoDaddy supported the bill and other related legislation like Protect IP as a viable method for policing piracy on the internet. They went so far as to publish an op-ed piece on Politico shortly after the bill was introduced praising the bill, as well as providing written testimony to the House Judiciary Committee in support. It seemed strange really, as they were the only internet company named in the Committee's list of corporate SOPA supporters, in a field of entertainment media production companies (Disney, etc.) and organizations that represent entertainment media and related special interest groups like the RIAA and MPAA.
This of course raised the ire of some of their customers, culminating in a Reddit-fueled boycott of GoDaddy by poster selfprodigy, who planned on moving all of their 51 domains away from GoDaddy's services. As of right now the post has over 3,000 comments and a Reddit score of 4,409 points, with more and more people voicing their opinions on the matter. While GoDaddy pretty much ignored the boycott as a nuisance to start, bigger threats from bigger customers like Ben Huh of the Cheezburger websites started to come in (with his 1,000 GoDaddy-registered domains), and GoDaddy did an about-face, stating in a news release that they would no longer support SOPA. But was that public reversal of policy nothing more than a parlor trick to woo customers back and keep the ones they still had? Their support for SOPA cost them about 37,000 domains, and it looks to me like the only reason they "reversed" their position was an increasing loss in revenue streams. An interview with GoDaddy CEO Warren Adelman by TechCrunch's Devin Coldewey also shows how this change of heart might not be genuine:
“Adelman couldn’t commit to changing its position on the record in Congress when asked about that, but said “I’ll take that back to our legislative guys, but I agree that’s an important step.” But when pressed, he said “We’re going to step back and let others take leadership roles.” He felt that the public statement removing their support would be sufficient for now, though further steps would be considered.”
"Sufficient for now." It's pretty clear that GoDaddy hasn't changed its position, but has instead publicly run to the middle with Swiss-like neutrality, which only further tells me that "We don't support SOPA" doesn't translate into much more than "We don't support losing customers and their cash." Adelman goes on to say that he will support SOPA when the internet community does and that there has to be "consensus about the leadership of the internet community." Leadership of the internet community? That's just the point: no one owns the internet, and this statement further shows how out of touch GoDaddy is with reality and the internet community they claim to serve. Having dealt with GoDaddy before, and reading other pre-SOPA stories of how they operate, it's just not that surprising.

Other pro-open internet registrars like Dreamhost, NetGator and Namecheap are taking this as an opportunity to take some of GoDaddy's customers through SOPA coupon codes like "NOSOPA" and "SOPASucks." Namecheap is even running an offer through December 29th in which they will donate $1 to the Electronic Frontier Foundation for each domain transfer from GoDaddy. NameCheap CEO Richard Kirkendall had the following to say on SOPA:
"While we at Namecheap firmly believe in intellectual rights, SOPA is like detonating a nuclear bomb on the internet when only a surgical strike is necessary. This legislation has the potential to harm the way everyone uses the Internet and to undermine the system itself. At Namecheap, we believe having a free and open Internet is the only option that will continue the legacy of innovation and openness that stands for everything we all value in our modern society."
GoDaddy really shot themselves in the foot here. This series of moves is going to lose them a lot of business. But if you’re the “silver lining” type, the GoDaddy mass exodus could be ammunition against SOPA supporters in Congress as a “here’s what we think” sort of statement. We’ll see. If you’re looking for another domain name registrar, Lifehacker has a list of some decent ones that are not pro-SOPA.
And about that “leadership of the internet” thing, I’ll throw my hat in the ring for “Internet Elder.”
Inventor of Etch A Sketch dies
Posted February 2, 2013 04:11 pm | Associated Press
BRYAN, Ohio — Andre Cassagnes, the inventor of the Etch A Sketch toy that generations of children drew on, shook up and started over, has died in France, the toy's maker said.

Cassagnes died Jan. 16 in a Paris suburb at age 86, said the Ohio Art Co., based in Bryan, Ohio. The cause wasn't disclosed Saturday.

"Etch A Sketch has brought much success to the Ohio Art Co., and we will be eternally grateful to Andre for that. His invention brought joy to so many over such a long period of time," said Larry Killgallon, the company's president.

Then an electrical technician, Cassagnes came upon the Etch A Sketch idea in the late 1950s when he peeled a translucent decal from a light switch plate and found pencil mark images transferred to the opposite face, the Toy Industry Association said.

Ohio Art saw his idea at the Nuremberg Toy Fair in 1959. The toy, with its gray screen, red frame and two white knobs that are twisted back and forth to create drawings, was launched in 1960 and became the top seller that holiday season. More than 100 million have been sold worldwide since.

Though passed over in popularity for video games and gadgets, the toy has a steady market, the company has said. It got a big jump in sales after Etch A Sketch was featured in the first two Toy Story movies.

Ohio Art also capitalized on a much-publicized gaffe by a Mitt Romney aide during last year's presidential election, who was asked about his candidate's views during the primary season versus the general election. He likened the campaign to an Etch A Sketch: "You can kind of shake it up and we start all over again."

Democrats and Republicans alike seized on the remark as evidence that Romney was willing to change his positions for political gain. Ohio Art seized on the publicity, creating a politically themed ad campaign and manufacturing blue versions of the famously red toy.

Etch A Sketches were made in Ohio until 2000, when the company moved production to China because of increasing costs.
AT&T 4G LTE Available In Louisville
Customers to benefit from ultra-fast mobile Internet on the latest LTE devices
LOUISVILLE, Ky., Nov. 14, 2012 /PRNewswire/ -- AT&T* has turned on its 4G LTE network in Louisville, bringing customers the latest generation of wireless network technology. Watch here to see several of the benefits AT&T 4G LTE provides, including:

- Faster speeds. LTE technology is capable of delivering mobile Internet speeds up to 10 times faster than 3G. Customers can stream, download, upload and game faster than ever before.
- Cool new devices. AT&T offers several LTE-compatible devices, including new AT&T 4G LTE smartphones and tablets, such as the Sony Xperia™ TL, LG Optimus G™, Samsung Galaxy S III, Motorola ATRIX™ HD, HTC One™ X, Nokia Lumia 900, Samsung Galaxy Note™, and Pantech Element™ tablet.
- Faster response time. LTE technology offers lower latency, or the processing time it takes to move data through a network, such as how long it takes to start downloading a webpage or file once you've sent the request. Lower latency helps to improve services like mobile gaming, two-way video calling and telemedicine.
- More efficient use of spectrum. Wireless spectrum is a finite resource, and LTE uses spectrum more efficiently than other technologies, creating more space to carry data traffic and services and to deliver a better network experience.

AT&T invested more than $600 million in its Kentucky wireless and wireline networks from 2009 through 2011 with a focus on improving the company's mobile Internet coverage and overall performance of its networks.

AT&T Kentucky State President Mary Pat Regan said her company's local investment creates many advantages for those in Louisville. "This 4G LTE launch is great news for Kentucky and is further evidence that when our elected leaders create an environment that favors investment, consumers benefit. The investment we've made to our AT&T wireline and wireless networks alone has equated to more than $600 million over the last three years, so bringing 4G LTE here is the latest example of this significant infrastructure investment," said Regan.

"We continue to see demand for mobile Internet skyrocket, and our 4G LTE network in Louisville responds to what customers want from their mobile experience — more, faster, on the best devices," said Chris Percy, vice president and general manager of Mobility and Consumer Markets for AT&T Kentucky, Tennessee and southern Indiana.
The network also is designed with its core elements distributed across the country, which helps reduce latency, or the delay when using the Internet, because your request isn't traveling as far.Even as AT&T continues to expand its 4G LTE coverage in 2012 and 2013, customers can get 4G speeds outside of 4G LTE areas on our 4G HSPA+ network, unlike competitors, where smartphone customers fall back to slower 3G technologies when outside of LTE coverage. AT&T's focus to deliver the best possible mobile Internet experience goes beyond 4G to embrace additional connection technologies. AT&T operates the nation's largest Wi-Fi network*** including more than 31,000 AT&T Wi-Fi Hot Spots at popular restaurants, hotels, bookstores and retailers across the country. Most AT&T smartphone customers get access to our entire national Wi-Fi network at no additional cost, and Wi-Fi usage doesn't count against customers' monthly wireless data plans.
AT&T also is a leading developer of Distributed Antenna Systems, which utilize multiple small antennas to maximize coverage and speed within stadiums, convention centers, office buildings, hotels and other areas where traditional coverage methods are challenging. Over the past five years, AT&T invested more than $115 billion into operations and into acquiring spectrum and other assets that have enhanced our wireless and wired networks. Since 2007, AT&T has invested more capital into the U.S. economy than any other public company. In a July 2012 report, the Progressive Policy Institute ranked AT&T No. 1 on its list of U.S. "Investment Heroes."

*AT&T products and services are provided or offered by subsidiaries and affiliates of AT&T Inc. under the AT&T brand and not by AT&T Inc.

**Limited 4G LTE availability in select markets. Deployment ongoing. 4G LTE device and data plan required. Up to 10x claim compares 4G LTE download speeds to industry average 3G download speeds. LTE is a trademark of ETSI. 4G speeds not available everywhere. Learn more about 4G LTE at att.com/network.

***Access includes AT&T Wi-Fi Basic. A Wi-Fi enabled device required. Other restrictions apply. See www.attwifi.com for details and locations.

About AT&T
AT&T Inc. (NYSE:T) is a premier communications holding company and one of the most honored companies in the world. Its subsidiaries and affiliates – AT&T operating companies – are the providers of AT&T services in the United States and internationally. With a powerful array of network resources that includes the nation's largest 4G network, AT&T is a leading provider of wireless, Wi-Fi, high speed Internet, voice and cloud-based services. A leader in mobile Internet, AT&T also offers the best wireless coverage worldwide of any U.S. carrier, offering the most wireless phones that work in the most countries. It also offers advanced TV services under the AT&T U-verse® and AT&T | DIRECTV brands. The company's suite of IP-based business communications services is one of the most advanced in the world. Additional information about AT&T Inc. and the products and services provided by AT&T subsidiaries and affiliates is available at http://www.att.com. This AT&T news release and other announcements are available at http://www.att.com/newsroom and as part of an RSS feed at www.att.com/rss. Or follow our news on Twitter at @ATT.

© 2012 AT&T Intellectual Property. All rights reserved. 4G not available everywhere. AT&T, the AT&T logo and all other marks contained herein are trademarks of AT&T Intellectual Property and/or AT&T affiliated companies. All other marks contained herein are the property of their respective owners.

Cautionary Language Concerning Forward-Looking Statements
Information set forth in this press release contains financial estimates and other forward-looking statements that are subject to risks and uncertainties, and actual results might differ materially. A discussion of factors that may affect future results is contained in AT&T's filings with the Securities and Exchange Commission. AT&T disclaims any obligation to update and revise statements contained in this news release based on new information or otherwise.

SOURCE AT&T Inc.
Copenhagen's political science
Posted: Friday, December 11, 2009 By Sarah Palin
With the publication of damaging e-mails from a climate research center in Britain, the radical environmental movement appears to face a tipping point. The revelation of appalling actions by so-called climate change experts allows the American public to finally understand the concerns so many of us have articulated on this issue.
"Climate-gate," as the e-mails and other documents from the Climate Research Unit at the University of East Anglia have become known, exposes a highly politicized scientific circle - the same circle whose work underlies efforts at the Copenhagen climate change conference. The agenda-driven policies being pushed in Copenhagen won't change the weather, but they would change our economy for the worse.
The e-mails reveal that leading climate "experts" deliberately destroyed records, manipulated data to "hide the decline" in global temperatures, and tried to silence their critics by preventing them from publishing in peer-reviewed journals. What's more, the documents show that there was no real consensus even within the CRU crowd. Some scientists had strong doubts about the accuracy of estimates of temperatures from centuries ago, estimates used to back claims that more recent temperatures are rising at an alarming rate.
This scandal obviously calls into question the proposals being pushed in Copenhagen. I've always believed that policy should be based on sound science, not politics. As governor of Alaska, I took a stand against politicized science when I sued the federal government over its decision to list the polar bear as an endangered species despite the fact that the polar bear population had more than doubled. I got clobbered for my actions by radical environmentalists nationwide, but I stood by my view that adding a healthy species to the endangered list under the guise of "climate change impacts" was an abuse of the Endangered Species Act. This would have irreversibly hurt both Alaska's economy and the nation's, while also reducing opportunities for responsible development.
Our representatives in Copenhagen should remember that good environmental policy-making is about weighing real-world costs and benefits - not pursuing a political agenda. That's not to say I deny the reality of some changes in climate - far from it. I saw the impact of changing weather patterns firsthand while serving as governor of our only Arctic state. I was one of the first governors to create a subcabinet to deal specifically with the issue and to recommend common-sense policies to respond to the coastal erosion, thawing permafrost and retreating sea ice that affect Alaska's communities and infrastructure.
But while we recognize the occurrence of these natural, cyclical environmental trends, we can't say with assurance that man's activities cause weather changes. We can say, however, that any potential benefits of proposed emissions reduction policies are far outweighed by their economic costs. And those costs are real. Unlike the proposals China and India offered prior to Copenhagen - which actually allow them to increase their emissions - President Obama's proposal calls for serious cuts in our own long-term carbon emissions. Meeting such targets would require Congress to pass its cap-and-tax plans, which will result in job losses and higher energy costs (as Obama admitted during the campaign). That's not exactly what most Americans are hoping for these days. And as public opposition continues to stall Congress' cap-and-tax legislation, Environmental Protection Agency bureaucrats plan to regulate carbon emissions themselves, doing an end run around the American people.
In fact, we're not the only nation whose people are questioning climate change schemes. In the European Union, energy prices skyrocketed after it began a cap-and-tax program. Meanwhile, Australia's Parliament recently defeated a cap-and-tax bill. Surely other nations will follow suit, particularly as the climate e-mail scandal continues to unfold.
In his inaugural address, President Obama declared his intention to "restore science to its rightful place." But instead of staying home from Copenhagen and sending a message that the United States will not be a party to fraudulent scientific practices, the president has upped the ante. He plans to fly in at the climax of the conference in hopes of sealing a "deal." Whatever deal he gets, it will be no deal for the American people. What Obama really hopes to bring home from Copenhagen is more pressure to pass the Democrats' cap-and-tax proposal. This is a political move. The last thing America needs is misguided legislation that will raise taxes and cost jobs - particularly when the push for such legislation rests on agenda-driven science.
Without trustworthy science and with so much at stake, Americans should be wary about what comes out of this politicized conference. The president should boycott Copenhagen.
Sarah Palin was the 2008 Republican nominee for vice president and governor of Alaska from 2006 to 2009.
How Hovercraft Technology Could Help People in the Most Remote Parts of the World
The innovative technology behind this cargo-hauling ship of the skies allows it to travel virtually anywhere.

What is a hovercraft? It's part boat, part airplane and part helicopter. While it may look a bit odd, how it works is quite simple. It's an amphibious vehicle that travels on a cushion of air created by a downward blast. Hovercraft use air to balance their weight, allowing the craft to operate efficiently. Hovercraft can also operate across nearly any terrain, including land, water and ice. It's this characteristic that makes hovercraft technology the ideal match for the Hybrid Airship, the cargo-hauling ship of the skies, enabling the airship to access remote locations around the world.

ACLS: WHERE HOVERCRAFT TECHNOLOGY MEETS THE HYBRID AIRSHIP
From carrying heavy equipment to isolated regions of Alaska, to serving as a flying clinic for disaster-relief efforts, there is almost no cargo mission this ship can’t perform. This is due to the fact that the airship can land nearly anywhere. What gives the Hybrid Airship this capability? The air cushion landing system (ACLS).
The ACLS looks like giant inflatable doughnuts on the bottom of a large blimp, and it makes the challenge of accessing remote regions around the globe a thing of the past. Lockheed Martin Skunk Works® developed an ACLS that blends hovercraft technology with our airship design.
“One of the biggest challenges to traditional cargo airship operations is how and where you park the airship,” said hybrid design program manager, Dr. Bob Boyd. “It’s very expensive and time consuming to develop infrastructure in remote areas around the world. The ACLS allows the Hybrid Airship to access these isolated regions without needing to build any runway or roads.” The ACLS system consists of three underbody hoverpads. These hoverpads create a cushion of air that allows the airship to float along the ground nearly friction free. The system gives the Hybrid Airship a unique capability to hover over water – a capability unmatched by any other cargo-hauling air vehicle.
If the Hybrid Airship needs to park on land, that's not a problem either. As the airship taxis, the hoverpads 'grip' the ground with light suction pressure to keep the airship from moving in variable winds.
‘Fingers’ hang below the pads to create a seal with the ground. These fingers allow the airship to taxi over obstacles, such as tree stumps or rocks, so extensive site preparation is not needed for a high volume cargo operation. These ‘fingers’ also make the hovercraft a sustainable solution because there is no long-term impact to the ground site.
With this innovative system, the Hybrid Airship is able to travel virtually anywhere and affordably stay there. OPENING A WORLD OF POSSIBILITIES More than two-thirds of the world’s land and more than half the world’s population have no direct access to paved roads. This lack of infrastructure presents challenges to accessing these isolated regions.
Imagine a world where virtually any place can be reached. Medical equipment, food and aid workers can be sent to aid disaster-relief efforts, and essential health outreach programs in developing nations around the world can constantly receive the resources they need.
The ACLS makes this idea a reality. The Hybrid Airship uses a combination of technologies to allow us to provide service in the most remote areas, opening a new world of possibilities.
ENGINEER SPOTLIGHT: MEET A SKUNK
For more than 70 years, Skunk Works has created revolutionary aircraft and technologies that push the boundaries of what is possible. What has been key to Skunk Works’ success? A group of “Skunks” – some of the most innovative, strategic and visionary thinkers around.
Cheryl Limas, software engineer, is one of these brave “Skunks”. As part of the Lockheed Martin Skunk Works team, Limas aids the development of software centric systems, such as the one-of-a-kind autonomous robot, SPIDER, that locates and patches tiny holes found in the Hybrid Airship’s envelope.
Her enthusiasm for supporting our service members brought Limas to Lockheed Martin. With Skunk Works’ legacy of technological achievement, Limas knew she wanted to be part of the team advancing the state of flight and technology.
“During my time at Lockheed Martin, I have been able to meet a variety of intelligent professionals that have helped aid my development as a young female engineer,” shared Limas. “I am fortunate to be mentored by, learn from, and work alongside this group of experienced and highly talented individuals.”
While there’s no sole recipe for a career at Skunk Works, Limas recommends taking advantage of every opportunity that is presented. You never know what experience might set you apart and help you in the future.
Four Fantastic Materials Science Principles in Action
Materials science is at the core of many superhero powers, including those exhibited by Marvel Comics’ Fantastic Four—elasticity, invisibility, super strength and thermodynamics.
While becoming invisible or changing your shape may seem like science fiction, the principles driving these abilities are very real—and, in fact, are already being applied by scientists every day.
As research into materials science continues to expand, industries around the world will benefit from an entire new set of solutions available at their fingertips—whether it is developing more efficient power sources, redesigning airplanes or traveling even deeper into space.
Read on to learn more about how materials science developments today may change the world of tomorrow. ELASTICITY (MISTER FANTASTIC)
In materials design, elasticity could refer to materials that are self-healing or reconfigurable.
Without human intervention, self-healing materials can repair damage by naturally reforming chemical bonds or using bacteria. In fact, such materials are already being used for applications like self-healing concrete or in the future, as anti-corrosive paint for Navy ships.
Reconfigurable materials, on the other hand, can change their properties under different conditions. At the microscopic scale, individual molecule bonds can reversibly change shape when absorbing and emitting energy. This translates to macroscopic shape change for polymer materials—for example, a polymer that curls or folds into itself when placed under light or electric charge.
“In the real world, we could imagine reconfigurable elements to be designed into planes or cars. Today, plane wing shapes are fixed, but the ideal wing shape is different during different phases of flight—taxi, takeoff, landing and so on,” said materials scientist Anna Paulson. “If designed with reconfigurable materials, these shapes could be optimized during flight to improve fuel efficiency.”
While a material that can turn a superhero into a parachute or trampoline is pretty farfetched, NASA is already exploring the use of flexible airplane wings, which could benefit from this type of research.

INVISIBILITY (INVISIBLE WOMAN)
Making an object appear invisible is really a matter of addressing patterns and light.
Invisible materials are patterned in a certain way, with conducting and insulating elements that can direct electromagnetic radiation around an object.
In rendering an object invisible, there are three big challenges: altering the size of these patterns, controlling light in three dimensions and designing a pattern for multiple wavelengths.
“Overcoming these challenges is physically possible, and already, patterns have been simulated with the necessary properties,” said Paulson. “Today, researchers are developing technology to fabricate three-dimensional nanoscale patterns that enable us to control light in three dimensions.”
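For a flavor of how an engineered effective index redirects light, here is a toy Snell's-law calculation that includes a negative index of the kind metamaterial designers pursue. The index values are arbitrary examples, not the properties of any real cloaking material.

```python
# Toy refraction calculation; the effective indices are arbitrary examples.
import math

def transmitted_angle_deg(incidence_deg, n1, n2):
    """Snell's law: a negative n2 flips the sign, meaning the transmitted ray
    emerges on the same side of the surface normal as the incident ray."""
    s = n1 * math.sin(math.radians(incidence_deg)) / n2
    return math.degrees(math.asin(s))

incidence = 30.0
for n2 in (1.5, -1.5):
    angle = transmitted_angle_deg(incidence, n1=1.0, n2=n2)
    print(f"effective index {n2:+.1f}: transmitted ray at {angle:+.1f} degrees")
```

A negative effective index bends the transmitted ray the "wrong" way compared with ordinary refraction, one of the counterintuitive behaviors that patterned metamaterials can produce.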
While being an invisible superhero obviously has its appeal, invisibility would come in handy for aesthetic purposes in our everyday lives.
Imagine using building materials with invisible properties for power lines or as guardrails on top of the Empire State Building. Other applications for materials that bend light are in optical processors for faster computers and in antenna materials for higher power antennas.
SUPER STRENGTH (THE THING)
To achieve super strength, you have to take the science principles down to a molecular level.
Nanotechnology is the manipulation of matter at the nanoscale, between one and 100 nanometers (a nanometer is one millionth of a millimeter). Here, you can alter individual atoms and molecules to change the physical, chemical, biological and optical properties.
As one of the best raw materials for nanotechnology, carbon provides a structure for graphene. In its purest form, graphene is a single atomic layer of carbon atoms, bonded so tightly together that they are impermeable to nearly everything—making the material both unbelievably strong and highly tolerant of harsh chemicals and wide ranges of temperatures and pressures.
While the material is currently being researched for use in everything from consumer electronics displays to medical devices, the possibility of perforating a sheet of graphene could also lead to new solutions for major global challenges like clean drinking water and energy management.

Nanotechnology has also led to the development of carbon nanotubes, which are incredibly small and incredibly strong—100 times stronger than steel and 10,000 times smaller than a single human hair.
“What makes carbon nanotubes so strong is their carbon atoms, which are configured and bonded to one another with the strongest chemical bonds available to them,” said Mitchell Meinhold, Lockheed Martin materials scientist.
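A quick way to appreciate what that bonding buys is to compare strength-to-weight ratios. The figures below are rough, textbook-style values chosen for illustration; measured numbers vary widely with material grade and test method.

```python
# Rough specific-strength comparison; the values are approximate and illustrative.

materials = {
    # name: (tensile strength in GPa, density in kg/m^3) -- assumed round numbers
    "structural steel":          (0.5, 7850),
    "carbon fiber (fiber only)": (3.5, 1800),
    "carbon nanotube (lab)":     (50.0, 1350),
}

for name, (strength_gpa, density) in materials.items():
    specific = strength_gpa * 1e9 / density  # N*m/kg
    print(f"{name:27s} ~{specific / 1e3:,.0f} kN*m/kg")
```

Even with a conservative figure for the nanotube, its strength-to-weight ratio comes out several hundred times higher than structural steel in this rough comparison.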
Today, carbon nanostructure fibers are being used in structures like the Juno spacecraft. In the future, given their energy efficient properties, carbon nanotubes could be used for long-lifetime lithium batteries, terabyte flash memory, smart phone chemical sensors, wiring for electronics that is woven into clothing and strong lightweight composites for consumer products.
However, a big limitation to producing nanotube-infused structures on a large scale is the ability to grow them to extremely long lengths. Nanotubes are currently grown in a lab—a miniature carbon nanotube forest of sorts—with the nanotubes reaching just a few centimeters in length.
“With meter-long nanotubes, you could imagine them being designed into something like lightweight cars,” said Paulson. “While the nanotubes are only strong in one direction, assembling them in multiple directions would allow the vehicle to resist impact.”
Scientists are already researching the use of multiple nanotubes stranded together to produce an incredibly strong, lightweight fabric. A real-life the Thing suit, anyone?
THERMODYNAMICS (HUMAN TORCH)
The ability for a material to withstand extremely high temperatures boils down to chemical bonds.
In general, stiffer and harder materials melt at higher temperatures. To serve as a protective barrier, the material must also be a poor conductor of heat.
For a vehicle (or superhero) to travel at hypersonic speeds—Mach 5 and above—it must be designed with these extremely durable materials capable of withstanding temperatures in excess of 2,200 degrees Fahrenheit. With such heat-resistant materials, we could design spacecraft to travel even deeper into space or explore extremely hot places—like the surface of the sun.
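One way to connect Mach 5 to temperatures like 2,200 degrees Fahrenheit is the ideal-gas stagnation-temperature relation, which estimates how hot air becomes when it is brought to rest against a leading edge. The sketch below ignores real-gas chemistry and radiative cooling, so treat the results as order-of-magnitude estimates only.

```python
# Ideal-gas stagnation temperature vs. Mach number; a rough estimate that ignores
# real-gas effects and radiative cooling.

GAMMA = 1.4          # ratio of specific heats for air
T_AMBIENT_K = 217.0  # roughly stratospheric ambient temperature

def stagnation_temperature_k(mach, t_ambient=T_AMBIENT_K, gamma=GAMMA):
    return t_ambient * (1.0 + 0.5 * (gamma - 1.0) * mach ** 2)

for mach in (1, 3, 5, 6):
    t0 = stagnation_temperature_k(mach)
    t0_f = (t0 - 273.15) * 9 / 5 + 32
    print(f"Mach {mach}: ~{t0:.0f} K (~{t0_f:.0f} deg F)")
```

Around Mach 5 to 6 this simple estimate lands in the same neighborhood as the article's figure, which is why hypersonic vehicles need materials far beyond ordinary aerospace alloys.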
“Of course, the exploration of places with very high temperatures would still be challenging because they are also areas of high pressure and radiation,” said Mike Stock, Lockheed Martin thermodynamicist. “However, we can design and develop advanced propulsion systems to take a spacecraft to the stars while avoiding the hazardous areas in space.”
Though a superhero like the Human Torch can envelop their body in flames, for the everyday person, heat-resistant materials could be useful in the areas of safety and fire protection. Very high temperature materials could also enable the construction of extremely efficient engines that would cut fuel consumption in half.
Walmart Loses Head Of Digital Media To Apple
By Paul Glazowski | 2008-02-02 09:05:57 UTC
Wal-Mart has lost the head of its digital media department. Kevin Swint, once the champion of the retail giant’s music and video services, has left one large empire to join another, according to a report by Rafat Ali of PaidContent.org.
Swint is starting the first full week of February as the international head of movies and television at Apple, Inc, where he will be tasked with managing the company’s burgeoning digital video channels. Apple has made clear its intentions to build the reach of its visual content offerings in line with the scope of its successful music distribution efforts. Swint’s new employer has evidently chosen him to pilot the video ship as it expands eastward across the globe.
Chalk another reduction up for Wal-Mart’s media download solutions. Though the company will of course manage to sign on a replacement for Swint, the fact of the matter is that Wal-Mart has remained mostly stagnant - in the case of video distribution, it has charted significant losses, all of which eventually led to the closing of its online movie service - as Apple has consistently pushed forward. It’s indeed logical for Swint to transfer to the hare rather than remain with the laggardly tortoise. (Relatively speaking.) Visible success, however young and unprecedented, is always more enticing than inactivity.
What this loss means for Wal-Mart for the near future is anyone’s guess. My impression of the news is that Wal-Mart is very much in a struggle with unplanned and undesired change, and it will not make amendments as quickly as some might wish it to. And that is quite an unfortunate prognosis to make for the top retail heavyweight in the world.
Last Updated: Wednesday, 19 March 2008, 18:23 GMT
Methane found on distant world
By Helen Briggs
Science reporter, BBC News
The planet is a "hot Jupiter" blasted by starlight
A carbon-containing molecule has been detected for the first time on a planet outside our Solar System.
The organic compound methane was found in the atmosphere of a planet orbiting a star some 63 light years away.
Water has also been found in its atmosphere, but scientists say the planet is far too hot to support life.
The discovery, unveiled in the journal Nature, is an important step towards exploring new worlds that might be more hospitable to life, they say.
Methane, made up of carbon and hydrogen, is the simplest possible organic compound.

HD 189733b
Located 63 light years from Earth, in the constellation Vulpecula, the little fox
About the size of Jupiter but orbits closer to the parent star in its Solar System than Mercury does in our own
Temperatures reach 900 degrees C, about the melting point of silver
Under certain circumstances, methane can play a key role in prebiotic chemistry - the chemical reactions considered necessary to form life.
Scientists detected the gas in the atmosphere of a Jupiter-sized planet known as HD 189733b.
Co-author Giovanna Tinetti from University College, London, told BBC News: "This planet is a gas giant very similar to our own Jupiter, but orbiting very close to its star.
"The methane here, although we can call it an organic constituent, is not produced by life - it is way too hot there for life."
Stepping stone
Dr Tinetti, and co-authors Mark Swain and Gautam Vasisht, from Nasa's Jet Propulsion Laboratory in Pasadena, California, found the tell-tale signature of methane in the planet's atmosphere using the Hubble Space Telescope.
The observations were made as the planet passed in front of its parent star, as viewed from Earth. As the star's light passed briefly through the planet's atmosphere, the gases imprinted their chemical signatures on the transmitted light.
My personal view is it is way too arrogant to think that we are the only ones living in the Universe
Dr Giovanna Tinetti
A method known as spectroscopy, which splits light into its components, revealed the chemical "fingerprint" of methane. The researchers also confirmed a previous discovery - made by Nasa's Spitzer Space Telescope - that the atmosphere of HD 189733b also contains water vapour.
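The measurement itself comes down to a tiny, wavelength-dependent change in how much starlight the planet blocks: at wavelengths where methane absorbs, the atmosphere is opaque a little higher up, so the planet looks slightly bigger. The radii and the extra absorbing altitude below are illustrative stand-ins, not the published values for HD 189733b.

```python
# Toy transit-depth calculation; radii and the assumed extra absorbing altitude
# are illustrative, not published measurements.

R_SUN_M = 6.96e8
R_JUPITER_M = 7.15e7

r_star = 0.76 * R_SUN_M        # assume a star somewhat smaller than the Sun
r_planet = 1.14 * R_JUPITER_M  # assume a slightly inflated hot Jupiter

def transit_depth(rp, rs):
    """Fraction of the star's light blocked while the planet crosses its disc."""
    return (rp / rs) ** 2

baseline = transit_depth(r_planet, r_star)
# Suppose the atmosphere is opaque roughly 500 km higher up inside a methane band.
in_band = transit_depth(r_planet + 5.0e5, r_star)

print(f"Continuum transit depth:   {baseline * 100:.3f} %")
print(f"Depth in the methane band: {in_band * 100:.3f} %")
print(f"Extra light blocked:       {(in_band - baseline) * 1e6:.0f} parts per million")
```

Differences of only a few hundred parts per million are why this kind of work takes a space telescope and extremely stable measurements.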
It shows that Hubble, Spitzer and a new generation of space telescopes yet to be launched can detect organic molecules on other extrasolar planets using spectroscopy, they say.
Dr Swain said: "This is a crucial stepping stone to eventually characterising prebiotic molecules on planets where life could exist."
Dr Tinetti said the technique could eventually be applied to extrasolar planets that appear more suitable for life than HD 189733b.
She said: "I definitely think that life is out there. My personal view is it is way too arrogant to think that we are the only ones living in the Universe."
Real worlds
The number of known planets orbiting stars other than our own now stands at about 270.
For most of them, scientists know little more than the planet's mass and orbital properties.
Adam Showman of the Department of Planetary Sciences at the University of Arizona, US, said scientists were finally starting to move beyond simply discovering extrasolar planets to truly characterising them as worlds.
Dr Showman, who was not part of the study, said: "The discovery does not by itself have any direct implications for life except that it proves a technique which might potentially be useful for characterising the atmosphere of rocky planets when we finally start discovering them."
Excitement about finding other Earth-like planets is driven by the idea that some might contain life; or that perhaps, centuries from now, humans might be able to set up colonies on them.
The key to this search is the so-called "Goldilocks zone", an area of space in which a planet is "just the right distance" from its parent star so that its surface is not-too-hot or not-too-cold to support liquid water.
FDA Moves To Regulate Increasingly Popular E-Cigarettes
By Rob Stein
Apr 23, 2014 | A woman tries electronic cigarettes at a store in Miami.
Originally published on April 24, 2014 12:33 pm

The Food and Drug Administration Thursday proposed regulating e-cigarettes for the first time. The agency unveiled a long-awaited rule that would give it power to oversee the increasingly popular devices, much in the way that it regulates traditional cigarettes.

"It's a huge change," FDA Commissioner Margaret Hamburg told reporters in a briefing Wednesday, before the official announcement of the agency's plans. "We will have the authority as a science-based regulatory agency to take critical actions to promote and protect the health of the public."

The proposal will be subject to public comment and further review by the agency before becoming final. But once that happens the rule would impose new restrictions, including:

- A ban on the sale of e-cigarettes to minors.
- A prohibition on distributing free samples.
- A ban on selling e-cigarettes in vending machines unless they are in places that never admit young people.
- A requirement that e-cigarettes carry warnings that they contain nicotine, which is addictive.
- A requirement that e-cigarette manufacturers disclose the ingredients in their products.

E-cigarettes are plastic or metal tubes about the size of a traditional cigarette that heat a liquid solution containing nicotine. That creates a vapor that users inhale. Their popularity has soared in recent years.

Some have welcomed the trend as a way to prevent people from smoking traditional cigarettes, which are far more dangerous, and to help smokers quit. Others fear the devices will addict nonsmokers to nicotine and eventually lead to more people smoking. That has fueled calls for the FDA to assert its authority over the devices. Although e-cigarettes are generally considered much safer than traditional cigarettes, some fear that not enough research has been done to know what risks they may have.

"We call the current marketplace for e-cigarettes the Wild Wild West," said Mitchell Zeller, who heads the FDA's Center for Tobacco Products. "We will be in a position to ensure that the products are as safe as they could possibly be."

The FDA had previously attempted to regulate e-cigarettes, but that effort was thwarted in court. The agency recently signaled, however, that it planned to try again. Thursday's announcement will make that official. The proposal would also require any new e-cigarettes to get FDA approval before being sold, and demand that current products provide a justification for remaining on the market.

The announcement stopped short of more aggressive regulation that some critics had called for, including restricting or banning fruit flavors and other sweeteners that may appeal to young people. It also won't restrict television advertising and online sales, as some had hoped. But Zeller and Hamburg said Thursday's move is the first step that could lead to such measures if the agency determines that those are needed.

"This announcement starts the process that will give us the authority to actually get out there and regulate e-cigarettes," Hamburg said.

The FDA also is proposing regulation of a variety of other tobacco-related products, including cigars, nicotine gels, pipe tobacco and water pipes.

E-cigarette companies and anti-smoking advocates had just started to review the FDA's proposal. But at least initially the industry welcomed the FDA's plans, apparently relieved the agency had not gone further.
"We are extremely relieved that all e-cigarette companies will be regulated, and forced to achieve and maintain the same high standards that Vapor Corp., and several of our responsible competitors, have been imposing on ourselves for years," said Jeffrey Holman, president and director of the Dania, Fla., company said in an email to Shots. Public health advocates generally welcomed the move as an important first step, but expressed disappointment that the agency had failed to regulate the devices more aggressively right away. "This action is long overdue," said Matthew Myers, president of the Campaign for Tobacco-Free Kids. "It is inexcusable that it has taken the FDA and the administration so long to act. This delay has had serious health consequences as these unregulated tobacco products have been marketed using tactics and sweet flavors that appeal to kids."Copyright 2014 NPR. To see more, visit http://www.npr.org/. Transcript DAVID GREENE, HOST: This is MORNING EDITION from NPR News. Good morning, I'm David Greene. STEVE INSKEEP, HOST: And I'm Steve Inskeep. Later today the Food and Drug Administration plans to make a big announcement about e-cigarettes. The agency will propose that it begin regulating smokeless devices just like traditional cigarettes. And we're going to talk about this with NPR health correspondent Rob Stein, who's in our studios. Rob, good morning. ROB STEIN, BYLINE: Good morning, Steve. INSKEEP: You've got something in your hands there. What is it? STEIN: That's right. I brought one of the see cigarettes with me. And, as you can see, it looks a lot like a cigarette holder. It looks like kind of a cross between and maybe a cigarettes and a pen. INSKEEP: Yeah, like a heavy kind of pen. STEIN: Yeah. INSKEEP: Like a fancy pen. STEIN: Yeah, and that's what these things are. They're usually made out of plastic or metal. They're tubes. And what they are, they have a battery in them that heats up a chamber that contains a fluid that contains nicotine in it. And that creates a vapor that people can inhale and they blow out something that looks like smoke. INSKEEP: We just remind people, nicotine is addictive substance in cigarettes. And so what you're getting here is the sensation of smoking and the nicotine without the actual smoke. STEIN: Right, that's the idea. This gives people sort of their nicotine fix without the dangers of inhaling burning tobacco and inhaling smoke from the burnt tobacco, and all the dangerous chemicals that that includes. INSKEEP: Sounds like a good idea, but it's been controversial, hasn't it? STEIN: Yes, it's been hugely controversial. On the one hand, some people say, look, this is a really great thing. It's much, much less dangerous than smoking cigarettes. And it can prevent people from starting to smoke cigarettes. It might help smokers quit smoking. But on the other hand, there are other people who say hold on a minute. This could be really dangerous. It's starting to make smoking look cool again. It could hook a whole new generation on smoking. And it could make it actually harder for people to quit smoking. INSKEEP: Does it make smoking seem cool again? I mean that's - it's like a fancy styling on the side of that. I'm sure you can do a lot of things you couldn't do with a paper cigarette. STEIN: Yeah, it comes in all kinds of fancy colors and designs. And there's been a lot of really aggressive marketing on television with the celebrities hawking these things. 
And so there's a lot of concern that - and there are these vaping lounges that people can go to that have become really popular.

INSKEEP: Oh, and you said television. These can be advertised on TV in the way the traditional cigarettes cannot.

STEIN: Exactly, that's one of the many ways that these things are different than regular cigarettes, is they're advertising them on TV. You can buy them online. And these are all things that you can't do with regular tobacco cigarettes.

INSKEEP: OK, so the FDA is proposing to regulate them. What is it the Food and Drug Administration wants to do?

STEIN: Well, basically the agency for the first time is asserting its authority over these devices. And that will do a whole bunch of things right away. Once - assuming this is a proposal, that's important to remember - but assuming this goes into effect, the things that will happen pretty much right away, most of them are aimed at preventing kids from using these things. Like it will ban sales of these devices to minors. It would ban sales in vending machines in most places. You couldn't distribute free samples, which is appealing to kids. And it will require the manufacturers to disclose the ingredients in these things for the first time.

INSKEEP: Meaning we don't know what's in them.

STEIN: That's one of the big questions, we really don't know what is in that. And that's one of the questions about these devices is really how much safer are they. And what's in here that could be causing health problems that we are unaware of?

INSKEEP: OK, two quick questions. First, how is the industry responding to this move to regulate them?

STEIN: So far they seem pretty positive. They seem basically kind of relieved that the agency didn't go a lot farther than some people had hoped. And so so far they're saying that this could be, you know, a good thing to have some regulations to sort of bring the industry into some sort of compliance. On the other hand, the antismoking people, you know, they're saying this is a really good first step, but there's a lot of things they wish the agency had done, like ban marketing in television advertising. And also the flavorings - these things come in fruit flavors and all kinds of other flavors that they wish the agency would ban. And the agency says they might get to that, they're just not there yet.

INSKEEP: In a couple of seconds, is this a move towards regulations then that, according to antismoking activists, is actually favoring the industry 'cause they're doing so little?

STEIN: Well, that's what some people are saying, that they think this actually could help the industry consolidate itself and really establish itself as a thriving industry. But they think, well, at least they did this and maybe down the road the agency might do tougher stuff.

INSKEEP: Rob, thanks for bringing that e-cigarette by.

STEIN: No problem.

INSKEEP: NPR's Rob Stein.

Transcript provided by NPR, Copyright NPR.
2017-09/1580/en_head.json.gz/582 | How the Hippies Saved Physics
From a review by George Johnson of How the Hippies Saved Physics: Science, Counterculture, and the Quantum Revival by David Kaiser (W. W. Norton & Company, 2011). Titled What Physics Owes the Counterculture, it was published on June 17, 2011 in the NYT Sunday Book Review.
“What the Bleep Do We Know!?,” a spaced-out concoction of quasi physics and neuroscience that appeared several years ago, promised moviegoers that they could hop between parallel universes and leap back and forth in time — if only they cast off their mental filters and experienced reality full blast. Interviews of scientists were crosscut with those of self-proclaimed mystics, and swooping in to explain the physics was Dr. Quantum, a cartoon superhero who joyfully demonstrated concepts like wave-particle duality, extra dimensions and quantum entanglement. Wiggling his eyebrows, the good doctor ominously asked, “Are we far enough down the rabbit hole yet?”…
Dr. Quantum was a cartoon rendition of Fred Alan Wolf, who resigned from the physics faculty at San Diego State College in the mid-1970s to become a New Age vaudevillian, combining motivational speaking, quantum weirdness and magic tricks in an act that opened several times for Timothy Leary. By then Wolf was running with the Fundamental Fysiks Group, a Bay Area collective driven by the notion that quantum mechanics, maybe with the help of a little LSD, could be harnessed to convey psychic powers. Concentrate hard enough and perhaps you really could levitate the Pentagon.
In “How the Hippies Saved Physics: Science, Counterculture, and the Quantum Revival,” David Kaiser, an associate professor at the Massachusetts Institute of Technology, turns to those wild days in the waning years of the Vietnam War when anything seemed possible: communal marriage, living off the land, bringing down the military with flower power Why not faster-than-light communication, in which a message arrives before it is sent, overthrowing the tyranny of that pig, Father Time?
Members of the Fundamental Fysiks Group, circa 1975; clockwise from left: Jack Sarfatti, Saul-Paul Sirag, Nick Herbert and Fred Alan Wolf
That was the obsession of Jack Sarfatti, another member of the group. Sarfatti was Wolf’s colleague and roommate in San Diego, and in a pivotal moment in Kaiser’s tale they find themselves in the lobby of the Ritz Hotel in Paris talking to Werner Erhard, the creepy human potential movement guru, who decided to invest in their quantum ventures. Sarfatti was at least as good a salesman as he was a physicist, wooing wealthy eccentrics from his den at Caffe Trieste in the North Beach section of San Francisco.
Other, overlapping efforts like the Consciousness Theory Group and the Physics/Consciousness Research Group were part of the scene, and before long Sarfatti, Wolf and their cohort were conducting annual physics and consciousness workshops at the Esalen Institute in Big Sur.
Fritjof Capra, who made his fortune with the countercultural classic "The Tao of Physics" (1975), was part of the Fundamental Fysiks Group, as was Nick Herbert, another dropout from the establishment who dabbled in superluminal communication and wrote his own popular book, "Quantum Reality: Beyond the New Physics" (1985). Gary Zukav, a roommate of Sarfatti's, cashed in with "The Dancing Wu Li Masters" (1979). I'd known about the quantum zeitgeist and read some of the books, but I was surprised to learn from Kaiser how closely all these people were entangled in the same web […]
2017-09/1580/en_head.json.gz/664 | Fargo Jet Center, Weather Modification Provide Cloud Seeding and Atmospheric Research
by Harry Weisberger
- October 11, 2011, 3:45 AM
Fargo Jet Center, a full-service FBO located at Hector International Airport, is an Avfuel dealer and operates an FAA Part 145 repair station offering aircraft maintenance and avionics services.

Mark Twain is said to have remarked, "Everybody talks about the weather but nobody does anything about it," but Fargo Jet Center (FJC) and Weather Modification, Inc. (WMI) qualify as exceptions. The two, joined at the hip on Hector International Airport, Fargo, N.D., are actually doing a lot about the weather.
Representatives of FJC and WMI are at the Fargo Jet exhibit at the Avfuel booth (No. N5121). The sister companies work closely together, according to Darren Hall, FJC marketing vice president. WMI partners with the National Center for Atmospheric Research (NCAR) on a variety of weather research and modification programs and also fields aircraft for customers needing cloud seeding and other special-mission applications. Fargo Jet Center evolved out of WMI in 1995. “WMI is Fargo Jet Center’s largest customer,” he said. “We do all its aircraft maintenance and modifications. The FJC presence around the world builds its business, and WMI projects generate work for FJC.
“We believe the professionals at WMI are the most experienced and expert in the atmospheric sciences and weather modification,” Hall said. WMI has been modifying and operating aircraft for cloud seeding and atmospheric research since 1961. Today it maintains and operates a fleet of more than 35 twin-engine aircraft in various configurations to meet the needs of every client. The two companies are currently working on a supplemental type certificate to give the Honeywell TFE731 turbofan engine a heavier duty, hail-resistant inlet.
“We use several aircraft models in our own operations, although we can adapt our weather equipment to virtually any platform for specific customer needs,” Hall said. WMI has STCs for cloud-seeding equipment on the Hawker 400; King Air 350, 200 and C90; and the two companies can provide either already modified aircraft for specific missions or modify customer aircraft to perform the operations required. WMI also performs requested maintenance and/or upgrades during the modification process. WMI/FJC modification specialists accommodate missions that include VIP transport, air ambulance, aerial photography, remote sensing, telemetry, environmental monitoring, cloud seeding, atmospheric chemistry and measurement, all with 24/7 worldwide flight operations and maintenance support.
Fargo Jet Center ranked eighth this year in the 2011 AIN FBO Survey. The FBO is an Avfuel dealer and operates an FAA Part 145 repair station offering aircraft maintenance and avionics services as well as aircraft sales, a Cessna Pilot Center flight school and an Argus Gold-rated charter service. Soon, FJC will host Hector International Airport’s U.S. Port of Entry with a new permanent office connected to the FBO terminal. Currently, Hall said, “Ninety-nine percent of [international] passengers clear through our terminal with on-site customs. The new facility will serve all international departures and arrivals right here.”
Jim Sweeney, FJC’s president and his brother Pat, CEO, are both University of North Dakota graduates. They are closely linked to the UND aviation studies program and have hired more than 350 intern pilots in connection with a UND class in weather modification. Hall said FJC plans “to resume an international training program to meet a worldwide pilot shortage. We just took delivery of the first of 10 Cessna 162 Skycatcher trainers we will accept through 2012. They’ll go into flight schools at both Fargo and St. Paul, Minnesota.”
http://www.ainonline.com/aviation-news/business-aviation/2011-10-11/fargo-jet-center-weather-modification-provide-cloud-seeding-and-atmospheric-research
2017-09/1580/en_head.json.gz/709 | Partnering for Smart Growth
A small biopharma expands internationally to become a top value creator.
A small California-based biopharma company had recently gone public and become profitable. But it faced a problem common among companies of its size: how to scale resources quickly in order to grow earnings and improve the top line.

The company called in BCG to help it devise an ambitious growth strategy that built on past success and ensured that the firm still had a tremendous impact on the diseases in which they were active.

In collaboration with BCG, the company evaluated its commercial organization. It became apparent that the business had outstripped the firm's capacity to go to market effectively, to do life cycle management on its products, and to effectively engage with payers. BCG's team worked with the firm to restructure the organization using best practices from biopharma and other industries, with the goal of increasing capacity for international expansion.

BCG worked with the company to restructure its R&D operation to remove barriers to competition in key markets, and developed a better model to bring new medicines to market. The company also assessed and integrated acquisitions to spur top-line growth.

BCG also helped the company scale across new markets in different countries and therapeutic areas. The company added a new therapeutic area to drive the growth necessary for shareholder value creation.

To enable a company-wide transformation, BCG worked with all levels of management, scientists, and personnel to help build lasting capabilities. BCG helped the company launch a major product born from an acquisition that became one of the industry's most successful products of all time.

The result of the nearly 15-year collaboration: the company has grown from several hundred million in revenue to almost $10 billion. The firm has ranked as one of the highest value-creating biopharma companies for the past several years.
The result of the nearly 15-year collaboration: an increase in revenue from a few hundred million to almost $10 billion.
BCG collaborated with the company for 15 years.
The company achieved more than 1,000% top-line growth.
The company has nearly $10 billion in current revenue.
2017-09/1580/en_head.json.gz/730 | Thermal Gradient, Inc. Announces Breakthrough in Fast PCR Device Technology 2/7/2011 8:09:37 AM
ROCHESTER, N.Y., Feb. 7, 2011 /PRNewswire/ -- Thermal Gradient, a Rochester, NY biotech company, announces a breakthrough in its fast PCR technology for molecular diagnostics. PCR (polymerase chain reaction) is the preferred critical step in virtually all forms of DNA testing.

With its latest generation of amplification devices, the company has developed and demonstrated PCR performance at speeds many times faster than conventional methods, achieving 8.5 decades of amplification (greater than 300 million times) in just eight minutes. For the first time, these simple devices are being made from low cost materials suitable for high volume manufacturing processes. The company believes that it now has the best platform for molecular testing that addresses ease of use, fast performance, and low cost operation.

The current generation of devices, developed under an NIH grant(1), is aimed at rapid point-of-care HIV testing in low resource settings. As such, the devices need to be easy to use, appropriate for field deployment, work very fast, and be produced in very large quantities at very low unit cost. By demonstrating that high performance can be achieved in devices fabricated with materials suitable for mass production processes, the company has taken a major step toward those goals. With this early success, the company has committed to its first mass producible injection molded design. Testing of these units is expected early in January of 2011.

These amplification devices are used in a cartridge that is already in development. This cartridge will be a single use, disposable unit that integrates sample preparation from whole blood, PCR amplification, and real-time detection. The instrument that will process these cartridges is also under development.

(1) Award Number R44AI089389 from the National Institute of Allergy and Infectious Diseases. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institute of Allergy and Infectious Diseases or the National Institutes of Health.

CONTACT: Joel Grover, 1-585-248-9598, jgrover@thermalgradient.com

SOURCE Thermal Gradient Inc.
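As a rough sanity check on the arithmetic above (an illustrative calculation, not something supplied by Thermal Gradient): "decades" of amplification are powers of ten, so 8.5 decades is about 3.2 x 10^8-fold, and under the idealized assumption that each PCR cycle doubles the target, that corresponds to roughly 28 cycles in eight minutes.

```python
import math

decades = 8.5                      # "decades" = powers of ten of amplification
fold = 10 ** decades               # total amplification factor
cycles = math.log2(fold)           # assumes ideal PCR: target doubles each cycle

print(f"{fold:.2e}-fold amplification")                 # ~3.16e+08, i.e. >300 million times
print(f"~{cycles:.0f} ideal doubling cycles")           # ~28 cycles
print(f"~{8 * 60 / cycles:.0f} seconds per cycle")      # ~17 s per cycle over 8 minutes
```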
2017-09/1580/en_head.json.gz/741 | Linus Pauling: A Life...
Linus Pauling: A Life in Science and Politics
Linus Pauling: A Life in Science and Politics BMJ
doi: https://doi.org/10.1136/bmj.312.7039.1170
(Published 04 May 1996)
Mikulas Teich
Ted Goertzel, Ben Goertzel. Basic Books (Harper Collins), £18.99, pp 300. ISBN 0 465 00672 8

Among 20th century scientists Linus Pauling (1901-94) occupies a conspicuous position. For one thing, he was a recipient of two Nobel prizes, for chemistry (1954) and peace (1962), and the authors of this highly readable, unauthorised, biography tell us that "he felt he deserved a third." It is also a useful book for understanding the politics—in the broadest sense of the word—of the Nobel prize awards. Pauling was given the Nobel prize in chemistry for "research into the nature of the chemical bond and its application to the structure of complex substances" which, seminally entered into in the 1920s, culminated with the celebrated book The Nature of the Chemical Bond and the Structure of Molecules and Crystals in 1939. So it took a while before the highest scientific accolade was bestowed on Pauling who, in 1954, was a controversial figure both in the Soviet Union and the United States. In the former his name was associated with the theory of resonance, deemed to be antimaterialist by some vigilante scientists and philosophers. In the latter the State Department decided that it was not in the interest of the United States for Pauling to go abroad. On the basis of information from McCarthy's House Un-American Activities Committee his passport was not renewed. Within this context the authors write: "In making the award when they did, the Swedes asserted their neutrality by defying both the Soviet Stalinists and the American McCarthyists." As to the Nobel prize for peace, its award was announced on "the same day [10 October 1963] the partial nuclear test ban treaty was signed by the three nuclear powers."

From 1966 until his death Pauling became involved in what he called "orthomolecular medicine"—that is, "the maintenance of health and cure of disease by regulating the concentration in the body of substances naturally found there." He became an advocate of megavitamin treatments and believed that vitamin C offers the best cure against the common cold—he himself took about 10 g a day, increasing the dosage to as much as 50 g when he felt a cold coming on. "The issue of vitamins and health," we read, "continues to be controversial….There is no compelling experimental evidence that megadoses of water-soluble vitamins are useful….The jury is still out."

In addition to Pauling's engagement in megavitamin treatments, readers of the BMJ might find it of interest that he participated, in 1953 and 1962, in two psychological studies of the personalities of scientists, relying in part on the Rorschach test. The results of the testing, in which Pauling was "an enthusiastic participant," are discussed by the authors in the appendix.

According to the distinguished American scientist Martin Kamen (he discovered carbon-14 and was falsely accused of espionage activities by the McCarthy committee), the biography of Pauling under review "may well become the definitive treatment of this remarkable man." It certainly offers a pleasurable introduction to him and should prove a valuable beginning for future historical work on his life.—MIKULAS TEICH, Robinson College, Cambridge
Teich Mikulas. Linus Pauling: A Life in Science and Politics. BMJ 1996;312:1170.
2017-09/1580/en_head.json.gz/831 | Print Email Font ResizeCU-Boulder, NOAA study uncovers oil and gas emission's 'chemical signature'Study finds that more than half of ozone-forming pollutants in Erie come from drilling activityBy John Aguilar Camera Staff WriterPosted:
Oil and gas wells shown near Erie Community Park in March 2012. (Matthew Jonas)
Emissions from oil and natural gas operations account for more than half of the pollutants -- such as propane and butane -- that contribute to ozone formation in Erie, according to a new scientific study published this week.

The study, the work of scientists at the Cooperative Institute for Research in Environmental Sciences at the University of Colorado, concluded that oil and gas activity contributed about 55 percent of the volatile organic compounds linked to unhealthy ground-level ozone in Erie. Key to the findings was the recent discovery of a "chemical signature" that differentiates emissions from oil and gas activity from those given off by automobiles, cow manure or other sources of volatile organic compounds.

"There were very, very few data points that did not fall on the natural gas line," Jessica Gilman, research scientist at CIRES and lead author of the study, said Wednesday. "We had a very strong signature from the raw natural gas."

CIRES is a joint institute of CU and the National Oceanic and Atmospheric Administration. Its study was published online Monday in the journal Environmental Science and Technology.

Emissions detected in Boulder

The air quality monitoring effort, dubbed the Boulder Atmospheric Observatory, was conducted in February and March of 2011 on a tower set up a couple of miles east of downtown Erie.
It showed that, on average, Erie had highly elevated levels of propane in its air -- 10 times the levels found in famously smoggy Pasadena, Calif., and four times those in Houston. The results prompted town leaders last year to place a six-month moratorium on new drilling applications while they gathered additional information on the fast-growing industry.

But trying to determine exactly how much of Erie's propane was due to the thousands of gas wells located in and around town, and how much was due to the effects of being part of a major metropolitan area, was inexact at best. Until now.

"What we saw at the Boulder Atmospheric Observatory was the mixing of two sources -- oil and gas and vehicles," Joost de Gouw, research physicist at CIRES, said. "For each compound, we can separate how much came from oil and gas and how much came from vehicles."

The researchers arrived at the unique chemical signature by analyzing the chemical makeup of all their air samples, characterizing 53 different types of volatile organic compounds and comparing the results to the composition of raw natural gas. "We estimate 55 percent of the compounds contributing to ozone formation in Erie are from oil and gas," de Gouw said.

And it's not just Erie that is affected by oil and gas activity, which has exploded in recent years in the gigantic Wattenberg Gas Field northeast of Denver. The study showed that scientists found the telltale signs of drilling emissions in air samples taken in Fort Collins and Boulder, albeit in lesser amounts. "Air pollution can travel many, many miles downwind from the source," Gilman said. "Air doesn't stop at any border."

Health effects in dispute

But whether emissions from oil and gas activity are endangering human health on a wide scale continues to be fiercely debated. Multiple families in Erie and around the state have complained that living so close to wells has made them sick, with nosebleeds, asthma and headaches as common symptoms. But two studies commissioned by the town last year concluded that the levels of propane in Erie weren't concerning. One environmental consulting firm concluded that even a lifetime exposure to the concentrations cited in the CIRES study would have a "low" risk of causing adverse health effects. A second firm stated that the town's propane levels were "1,000-fold or more below those considered to be of health concern."

Doug Flanders, spokesman for the Colorado Oil and Gas Association, wrote in an email Wednesday that Denver's air quality is "much better" than that of Houston or Los Angeles. "December 2012 EPA data show Denver-area smog levels are well below those of Houston or Los Angeles," he wrote.

Gordon Pierce, technical services program manager for the Colorado Department of Public Health and Environment's air pollution control division, said the state put in place stricter ozone controls in 2006 and 2008 for the oil and gas industry. New control requirements were established for condensate tanks and new reporting and record-keeping requirements were also implemented. Late last year, Pierce said, the state adopted new EPA rules regarding emissions at gas wells. But he said his agency will continue to monitor the effects of the industry in Colorado, which now has more than 50,000 active wells.

There may not be proven health effects from individual volatile organic compounds, Pierce said, but when those compounds are combined with nitrogen oxides from vehicle tailpipes and baked in the sun, they form ozone.
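The two-source separation de Gouw describes is, in essence, a small source-apportionment problem: express each measured compound as a combination of a raw-natural-gas signature and a vehicle-exhaust signature, then solve for the two contributions. The sketch below is illustrative only; the signature ratios and ambient values are invented placeholders, not CIRES data.

```python
import numpy as np

# Illustrative only: each row is one VOC (e.g., propane, n-butane, benzene);
# columns are source "signatures" (relative abundance per unit of emission).
# These numbers are placeholders, not the published CIRES measurements.
signatures = np.array([
    [0.80, 0.10],   # propane: mostly from raw natural gas
    [0.60, 0.15],   # n-butane
    [0.05, 0.40],   # benzene: mostly from vehicle exhaust
])

measured = np.array([0.74, 0.57, 0.17])   # hypothetical ambient mixing ratios

# Solve measured ~= signatures @ [gas_share, vehicle_share] by least squares
contributions, *_ = np.linalg.lstsq(signatures, measured, rcond=None)
gas, vehicles = contributions
total = gas + vehicles
print(f"oil & gas: {100 * gas / total:.0f}%   vehicles: {100 * vehicles / total:.0f}%")
```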
At ground level, ozone can cause breathing difficulties and eye irritations, especially among the young and elderly. He said it's possible that if a strong enough link is established between oil and gas operations and high levels of ozone, the state could pursue stronger regulatory measures. "We're always looking at different strategies for reducing ozone," Pierce said.

Pointing to Weld County

In the meantime, anti-drilling activists such as Jen Palazzolo plan to use the CIRES study's findings to once again put pressure on Erie's elected leaders to be tougher on oil and gas operators in town. Palazzolo is a leader of Erie Rising, a group that has strongly resisted oil and gas drilling in Erie. "The study supports information that we've been trying to put out and that the industry has been trying to shoot down," she said. "There's no way you can argue against the fact that oil and gas contributes to ozone precursors."

After its drilling moratorium expired in September, Erie entered into memoranda of understanding with two operators that require them to use steel-rim berms around tanks and separators, closed-loop systems for drilling and completion operations, and a more effective vapor recovery unit for new wells. The companies also agreed not to use hydraulic fracturing fluid products that contain diesel, 2-Butoxyethanol or benzene. But Palazzolo and others opposed to the industry say those agreements don't do nearly enough to protect the public health. "It's time for us to re-address this with the Erie Board of Trustees and ask them if they are going to continue approving the number of oil and gas operations that are planned in town," she said.

Erie Trustee Mark Gruber said the town has done just about all it can on the issue, given the reality that control over the industry rests with the state and not local communities. Furthermore, he wonders how effective additional restrictions would be given the fact that the science indicates that volatile organic compounds are windborne and travel long distances. "By and large, the contamination we see -- if it's in Erie, I'm going to point to Weld County," he said. "We're between a rock and a hard place. We've got 300 wells, they've got 19,000. We can't build a wall to stop the emissions coming from Weld County."

The county had 19,799 active wells as of last week, and in 2012 saw 1,826 well permits approved, representing 48 percent of all permits approved statewide that year. Weld County Commissioner Sean Conway said his county works closely with state health authorities on ensuring the cleanest footprint from oil and gas operations. Ninety-five to 98 percent of emissions are being captured by the industry now, he said, and ozone levels have actually decreased in his county over the last few years.

Still, he said he welcomes the news this week that emissions from the oil and gas industry can be specifically traced by their chemical makeup. He said it could serve as a powerful tool for dealing with the industry from a factual standpoint, rather than an emotional one. "If they are able to identify this now, it will give us a good starting point in identifying what are the actual impacts of this and how we go about ensuring public health and safety," Conway said.

Contact Camera Staff Writer John Aguilar at 303-473-1389 or aguilarj@dailycamera.com.
2017-09/1580/en_head.json.gz/890 | Update: Kepler Mission Sets Out to Find Planets Using CCD Cameras
Jansen Ng (Blog) - March 8, 2009 3:59 PM
Part of Kepler's CCD array (Source: NASA)
Kepler spacecraft will detect small planets close to the size of earth.
Humanity has wondered about the heavens above since before recorded history. Recently, the discovery of hundreds of planets in other star systems has sparked extraordinary interest in determining the odds of extraterrestrial life.
The Kepler mission will seek to explain one part of the puzzle by observing the brightness of over 100,000 stars over the next forty-two months. In doing so, it will be able to track Earth-sized planets, generating targets of interest for more advanced future space observatories like the Terrestrial Planet Finder and the Laser Interferometer Space Antenna.
Nearly all of the extrasolar planets detected thus far are giant planets the size of Jupiter or larger. Kepler will look for planets 30 to 600 times less massive, closer to the size of Earth and more likely to support life.
A planet in a stable orbit transits across its star once during its own annual cycle, causing a dip in the star's apparent magnitude for an observer aligned with the orbital plane. By timing these transits, the orbit and length of year can be calculated. The orbit of a planet can be used to determine whether it lies within the "zone of life", where it is close enough to its star to support liquid water, yet far enough away that potential life is not destroyed by it.
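As a rough illustration of the transit method (an illustrative calculation, not part of the Kepler team's actual pipeline): the depth of the brightness dip is approximately the square of the planet-to-star radius ratio, and the time between successive dips gives the orbital period, from which Kepler's third law yields the orbital distance. Assuming round numbers for a Sun-like star and an Earth-size planet:

```python
import math

# Assumed round values for a Sun-like star and an Earth-like planet
R_STAR_KM, R_PLANET_KM = 696_000, 6_371
M_STAR_KG = 1.989e30
G = 6.674e-11

# Transit depth: fraction of starlight blocked ~ (Rp / R*)^2
depth = (R_PLANET_KM / R_STAR_KM) ** 2
print(f"transit depth ~ {depth:.1e} (a {depth * 100:.3f}% dip)")   # ~8e-5, i.e. 0.008%

# One transit per year -> semi-major axis via Kepler's third law:
# a^3 = G * M * P^2 / (4 * pi^2)
period_s = 365.25 * 24 * 3600
a_m = (G * M_STAR_KG * period_s ** 2 / (4 * math.pi ** 2)) ** (1 / 3)
print(f"orbital distance ~ {a_m / 1.496e11:.2f} AU")                # ~1.00 AU
```

The tiny dip is why Kepler needs such a sensitive, continuously staring photometer: an Earth analog dims a Sun-like star by less than one part in 10,000.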
"Kepler's mission is to determine whether Earth-size planets in the habitable zone of other stars are frequent or rare; whether life in our Milky Way galaxy is likely to be frequent or rare", said William Borucki, NASA's Principal Investigator on the Kepler Mission.
While Kepler will only focus on a small area of the sky, its results will be enough to enable accurate estimates of the number of earth-sized planets in our galaxy.
Kepler will use an array of 42 CCD (charge-coupled device) cameras, each measuring 50x25 mm. With a resolution of 1024x2200 each, Kepler has a total resolution of approximately 95 megapixels.
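A quick back-of-the-envelope check of that figure, using only the numbers quoted above:

```python
ccds = 42
width_px, height_px = 1024, 2200

total_px = ccds * width_px * height_px
print(f"{total_px:,} pixels ~ {total_px / 1e6:.0f} megapixels")   # 94,617,600 ~ 95 MP
```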
CCD cameras are used in most digital cameras and optical scanners. They are also used in astronomy and in night-vision devices due to their sensitivity to the ultraviolet and infrared ranges of light.
Mission operations will be conducted by NASA's Ames Research Center in Moffett Field, California, and are included as part of the $600 million total mission cost. Ames will contact the Kepler spacecraft twice a week using the X-band for command updates as well as system status updates. Scientific data is only downloaded once a month using the Ka-band, at a data rate of up to 4.33 Mb/s. To conserve bandwidth, Kepler will conduct partial analysis on board and only transmit data of interest to researchers.
The Kepler spacecraft will be launched at 2250 Eastern Standard Time from Cape Canaveral Air Force Station in Florida. It will use the Delta II multi-stage rocket, which has flown 140 missions while achieving a success rate of almost 99 percent.
Instead of a typical Earth orbit, the rocket will place Kepler into an Earth-trailing orbit in order to block light from the sun and the moon. This orbit also avoids gravitational perturbations inherent in an Earth orbit, thus allowing for additional platform stability.
The Kepler Mission is named for Johannes Kepler, best known for his Laws of Planetary Motion.
The Kepler spacecraft was launched successfully aboard a Delta II rocket in the D2925-10L launch configuration from pad 17B at 22:49:57 EST on Friday, March 6th. The three-stage launch vehicle had nine additional solid rocket boosters, six for the first stage and three for the second stage. The third stage boosted the Kepler payload to its heliocentric orbit trailing Earth. Two months of testing and systems verification will occur before Kepler begins its inspiring mission.
"We don't know how to make a $500 computer that's not a piece of junk." -- Apple CEO Steve Jobs
Distant Planet Has High Temperature of 2240F
First Pictures of Extrasolar Planets Taken | 科技 |
2017-09/1580/en_head.json.gz/915 | Home > Apple > Amazon cuts cost of hit songs to 69 cents Amazon cuts cost of hit songs to 69 cents By
Song downloads are cheap these days by any standards. And now they’ve just got a little bit cheaper.
In what’s being seen by many as a bold move, Amazon has just knocked 20 cents off the cost of some of its MP3s in a bid to wrestle some of the music download business away from Apple, whose iTunes store currently dominates the market. The reduced price means that the e-commerce giant is now offering some top-selling songs for the bargain bucket price of just 69 cents.
A report in the LA Times points out that Amazon’s market share of the business has been languishing for the last two years at about 10 percent, whereas Apple’s iTunes continues to enjoy a share of some 70 percent.
Time will tell if the price cut creates some new loyal customers and causes a shift in the market. Speaking to the Times, Russ Crupnick, a digital music analyst at the NPD Group, said, “The average music consumer spends $46 a year on digital music, which is half of what it was last year. The question is not whether you can sell a 69-cent track. It’s whether you can get a customer to spend $69.”
Amazon are clearly intent on trying to upset the Apple music cart – the price cut comes off the back of Amazon’s recent launch of its cloud-based music locker service in March (something which Apple has yet to offer, though may be coming soon), though admittedly that service has been experiencing some problems of late.
The songs available for 69 cents at Amazon’s music store include recent releases such as Lady Gaga’s Judas, Kelly Rowland’s Motivation and Gorillaz’ Revolving Doors. The same songs are selling on Apple’s iTunes store for $1.29.
Let’s wait and see if Amazon’s latest move attracts any music lovers from the iTunes store, or if indeed whether Apple starts making some price cuts of its own, signalling the start of a price war. | 科技 |
2017-09/1580/en_head.json.gz/1102 | New Tech Forum
By John Hugg
Emerging tech dissected by technologists
Fast data: The next step after big data
Open source tools help companies process data streams. To bring in complex queries and transactional capabilities, VoltDB's John Hugg suggests adding an in-memory NewSQL data store
The way that big data gets big is through a constant stream of incoming data. In high-volume environments, that data arrives at incredible rates, yet still needs to be analyzed and stored. John Hugg, software architect at VoltDB, proposes that instead of simply storing that data to be analyzed later, perhaps we've reached the point where it can be analyzed as it's ingested while still maintaining extremely high intake rates using tools such as Apache Kafka. -- Paul Venezia

Less than a dozen years ago, it was nearly impossible to imagine analyzing petabytes of historical data using commodity hardware. Today, Hadoop clusters built from thousands of nodes are almost commonplace. Open source technologies like Hadoop reimagined how to efficiently process petabytes upon petabytes of data using commodity and virtualized hardware, making this capability available cheaply to developers everywhere. As a result, the field of big data emerged. A similar revolution is happening with so-called fast data.

First, let's define fast data. Big data is often created by data that is generated at incredible speeds, such as click-stream data, financial ticker data, log aggregation, or sensor data. Often these events occur thousands to tens of thousands of times per second. No wonder this type of data is commonly referred to as a "fire hose."

When we talk about fire hoses in big data, we're not measuring volume in the typical gigabytes, terabytes, and petabytes familiar to data warehouses. We're measuring volume in terms of time: the number of megabytes per second, gigabytes per hour, or terabytes per day. We're talking about velocity as well as volume, which gets at the core of the difference between big data and the data warehouse. Big data isn't just big; it's also fast.

The benefits of big data are lost if fresh, fast-moving data from the fire hose is dumped into HDFS, an analytic RDBMS, or even flat files, because the ability to act or alert right now, as things are happening, is lost. The fire hose represents active data, immediate status, or data with ongoing purpose. The data warehouse, by contrast, is a way of looking through historical data to understand the past and predict the future.

Acting on data as it arrives has been thought of as costly and impractical if not impossible, especially on commodity hardware. Just like the value in big data, the value in fast data is being unlocked with the reimagined implementation of message queues and streaming systems such as open source Kafka and Storm, and the reimagined implementation of databases with the introduction of open source NoSQL and NewSQL offerings.

Capturing value in fast data

The best way to capture the value of incoming data is to react to it the instant it arrives. If you are processing incoming data in batches, you've already lost time and, thus, the value of that data.

To process data arriving at tens of thousands to millions of events per second, you will need two technologies: First, a streaming system capable of delivering events as fast as they come in; and second, a data store capable of processing each item as fast as it arrives.

Delivering the fast data

Two popular streaming systems have emerged over the past few years: Apache Storm and Apache Kafka.
Originally developed by the engineering team at Twitter, Storm can reliably process unbounded streams of data at rates of millions of messages per second. Kafka, developed by the engineering team at LinkedIn, is a high-throughput distributed message queue system. Both streaming systems address the need to process fast data. Kafka, however, stands apart.

Kafka was designed to be a message queue and to solve the perceived problems of existing technologies. It's sort of an über-queue with unlimited scalability, distributed deployments, multitenancy, and strong persistence. An organization could deploy one Kafka cluster to satisfy all of its message queueing needs. Still, at its core, Kafka delivers messages. It doesn't support processing or querying of any kind.
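To make the delivery side concrete, here is a minimal sketch of publishing and consuming events with the kafka-python client. The broker address and topic name are assumptions for illustration, and a production pipeline would add batching, partition keys, and error handling; the consumer loop is where each event would be handed to a fast data store for per-event processing.

```python
from kafka import KafkaProducer, KafkaConsumer  # pip install kafka-python
import json

BROKER = "localhost:9092"   # assumed broker address
TOPIC = "clickstream"       # assumed topic name

# Producer: push each incoming event onto the fire hose as it happens
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)
producer.send(TOPIC, {"user_id": 42, "action": "click", "ts": 1700000000})
producer.flush()

# Consumer: react to each event the instant it arrives instead of batching
consumer = KafkaConsumer(TOPIC, bootstrap_servers=BROKER, auto_offset_reset="earliest")
for message in consumer:                  # blocks and streams indefinitely
    event = json.loads(message.value)
    print(event["user_id"], event["action"])   # hand off to the fast data store here
```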
2017-09/1580/en_head.json.gz/1104 | Sciences Breakthrough of the Year: Watching evolution in action
Science honors the top ten research advances of 2005
Evolution has been the foundation and guiding theory of biology since Darwin gave the theory its proper scientific debut in 1859. But Darwin probably never dreamed that researchers in 2005 would still be uncovering new details about the nuts and bolts of his theory -- how does evolution actually work in the world of influenza genes and chimpanzee genes and stickleback fish armor? Studies that follow evolution in action claim top honors as the Breakthrough of the Year, named by Science and its publisher AAAS, the nonprofit science society. In 2005, scientists piled up new insights about evolution at the genetic level and the birth of species, including information that could help us lead healthier lives in the future. Ironically, these often-startling discoveries occurred in a year when backers of "intelligent design" and other opponents of evolution sought to renew challenges to this fundamental concept.
This milestone, plus nine other research advances, make up Science's list of the top ten scientific developments in 2005, chosen for their profound implications for society and the advancement of science. Science's Top Ten list appears in the 23 December 2005 issue of the journal Science. Many of this year's breakthrough studies followed evolution at the genetic level. In October this year, an international team of researchers unveiled a map of the chimpanzee genome. Scientists are already poring over the chimpanzee genome and another international effort, the biggest map to date of single-letter variations in the human genetic sequence, hoping to get a better glimpse of the human species' evolutionary history. The two studies give scientists new material for studying conditions from AIDS to heart disease, and may lay the groundwork for a future of personalized genetic medicine.
This year's sequencing of the 1918 pandemic flu virus could have a more immediate impact on medicine. The amazing story of flu genes preserved in permafrost and painstakingly reconstructed has a chilling coda: the deadly flu seems to have started out as purely a bird virus. Understanding the evolution of last century's deadly bird flu may help us predict and cope with the current bird flu threat.
Other studies showed how small changes in DNA can trigger dramatic evolutionary events. Researchers found that a single genetic change can be all it takes to turn one species into many, as in the case of the Alaskan stickleback fish that lost its armor and evolved from an ocean-loving species to a variety of landlocked lake dwellers. Beyond the genome, researchers watched evolution in action among a number of animals, from caterpillars to crickets, and found that behavioral differences such as what to eat and when to mate may be enough to turn a single population into two species. These painstaking observations and other experiments showed that evolutionary studies are as relevant to 2005 as they were to 1859. Science also salutes nine other scientific achievements of 2005.

Planetary Safaris: With spacecraft at or on the way to the moon, Mercury, Venus, Mars, a comet, an asteroid, Saturn, and the very edge of the solar system, planetary discovery soared in 2005. The high point in a year of highlights may be the landing of the European spacecraft Huygens on Titan, Saturn's largest moon. Huygens' trip to Titan revealed a world where infrequent but drenching rains of liquid methane shape the land and participate in a fascinating hydrologic cycle.
A Rich Year for Plants: Several key molecular cues behind flowering and other plant mysteries and surprises came to light in 2005. For example, plant molecular biologists pinned down the identity of a signal that initiates the seasonal development of flowers. Other research focused on a gene involved in stimulating flowering, and another study highlighted a surprising cache of RNA.
The Nature of Neutron Stars: In 2005, new instruments yielded vivid insights into the most violent behaviors of neutron stars. A short, intense pulse of radiation from near the center of the Milky Way, recorded on 27 December 2004, may be the result of a short gamma ray burst -- a rapid merger of two ancient neutron stars or a neutron star and a black hole.

Brain Wiring and Disease: Several studies in 2005 suggest that diseases such as schizophrenia, Tourette syndrome, and dyslexia are rooted in "faulty wiring" of the brain's neural circuitry during development in the womb.
Where Did Earth Come From?: This year, researchers took another look at Earth rocks and meteorites that resemble the starting material of the solar system and found that their atoms were significantly different. So where did Earth get its building blocks? Some scientists now say early Earth materials come from a different part of the solar system, while others say parts of early Earth are just sunk deep in the planet, hidden from view.
Key Proteins Close-up: The most detailed molecular portrait to date of a voltage-gated potassium channel was unveiled in 2005. These channels, gatekeeper proteins that usher potassium ions in and out of cells, are as key to nerve and muscle functioning as transistors are to computers.

Changing Climate of Climate Change?: In 2005, evidence linking humans to global warming continued to accumulate and U.S. politicians began to take notice. From the warming of deep ocean waters and increased frequencies of the most intense tropical cyclones to continued reductions in ice cover in the Arctic Ocean and altered bird migratory patterns, scientific evidence for climate change built up in 2005 and non-scientists seem to have listened.

Cell Signaling Steps Up: Dynamic views of how cells respond to the chemical and environmental signals all around them took hold in 2005 thanks to efforts to track multiple inputs and outputs of cell signaling networks simultaneously. For example, researchers created a model of nearly 8,000 chemical signals involved in a network leading to programmed cell death.
ITER Lands in France: The struggle over the location of the world's first fusion reactor has ended -- the International Thermonuclear Experimental Reactor (ITER) will be built at Cadarache in southern France and not in Rokkasho, Japan. One aim of ITER is to generate fusion-powered electricity by recreating the power of the sun on Earth.
Science's Breakdown of the Year -- U.S. Particle Physics: With the cancellation of two major experiments and talk of an early closing for one of the three existing particle colliders, U.S. particle physics is Science's breakdown of 2005. As the U.S. program founders, particle physics research around the world could suffer. A bit of good particle physics news did emerge in 2005, however – researchers around the world remain committed to building the International Linear Collider, a multibillion-dollar global facility that may be the key to the future of particle physics.

Areas to watch in 2006: This year, Science's predictions for hot fields and topics in the upcoming year include drug and vaccine development for avian flu, RNA-interference in humans, high-temperature superconductors, the microbial family tree, detection of the merging of two neutron stars and ultrahigh-energy cosmic rays – the speediest atomic nuclei in the universe. Researchers will also be on the lookout for more evidence for the ivory-billed woodpecker and solid helium flowing like a liquid.
Natasha Pinol | EurekAlert!
http://www.sciencemag.org
http://www.aaas.org
2017-09/1580/en_head.json.gz/1133 | The American Naturalist
Vol. 94, No. 875, Mar. - Apr., 1960
The Evolution of Stability in Marine Environments: Natural Selection at the Level of the Ecosystem
M. J. Dunbar
Vol. 94, No. 875 (Mar. - Apr., 1960), pp. 129-136
Published by: The University of Chicago Press for The American Society of Naturalists
Species, Marine ecosystems, Evolution, Climate models, Animals, Breeding, Marine ecology, Ecosystems, Seas, Population ecology
Description: Since its inception in 1867, The American Naturalist has maintained its position as one of the world's premier peer-reviewed publications in ecology, evolution, and behavior research. Its goals are to publish articles that are of broad interest to the readership, pose new and significant problems, introduce novel subjects, develop conceptual unification, and change the way people think. AmNat emphasizes sophisticated methodologies and innovative theoretical syntheses—all in an effort to advance the knowledge of organic evolution and other broad biological principles.
Starting from the premise that oscillations are dangerous for any system and that violent oscillations may be lethal, this paper contrasts the highly stable production systems of tropical waters with the seasonal and longer-term oscillations of temperate and polar waters. The differences are climatically determined, and since the present glacial type of climate is young in the climatic history of the earth, the ecological systems of the higher latitudes are considered as immature and at a low level of adaptation. That they may be in process of evolution toward greater stability is suggested by a number of phenomena, such as the development of large, slow-respiring, slow-growing individuals, and the production of the young in many arctic invertebrates in mid-winter or late fall. These and other observed peculiarities of high latitude fauna tend to make the most efficient use of the available plant food and to spread the cropping pressure over as much of the year as possible. Oceanic birds are cited as examples in which stable populations have been achieved by evolution of lower breeding rates, and the phosphate and nitrate cycles in the upper layers of tropical seas are discussed. It is emphasized that selection here is operating at the level of the ecosystem; competition is between systems rather than between individuals or specific populations.
The American Naturalist © 1960 The University of Chicago Press
2017-09/1580/en_head.json.gz/1229 | Station Directory Sponsor Sponsor Dead loved ones' voices fall victim to technology Issues Oct 12, 2013
Lisa Moore (Michael Conroy/AP)
By TOM COYNE, Associated Press

When her 19-year-old daughter died of injuries sustained in a Mother's Day car crash five years ago, Lisa Moore sought comfort from the teenager's cellphone.
She would call daughter Alexis' phone number to listen to her greeting. Sometimes she'd leave a message, telling her daughter how much she loved her. "Just because I got to hear her voice, I'm thinking 'I heard her.' It was like we had a conversation. That sounds crazy. It was like we had a conversation and I was OK," the Terre Haute, Ind., resident said.

Moore and her husband, Tom, have spent $1,700 over the past five years to keep their daughter's cellphone service so they could preserve her voice. But now they're grieving again because the voice that provided solace has been silenced as part of a Sprint upgrade.

"I just relived this all over again because this part of me was just ripped out again. It's gone. Just like I'll never ever see her again, I'll never ever hear her voice on the telephone again," said Lisa Moore, who discovered the deletion when she called the number after dreaming her daughter was alive in a hospital.
Technology has given families like the Moores a way to hear their loved ones' voices long after they've passed, providing them some solace during the grieving process. But like they and so many others have suddenly learned, the voices aren't saved forever. Many people have discovered the voices unwittingly erased as part of a routine service upgrade to voice mail services. Often, the shock comes suddenly: One day they dial in, and the voice is inexplicably gone.
A Sprint upgrade cost Angela Rivera a treasured voice mail greeting from her husband, Maj. Eduardo Caraveo, one of 13 people killed during the Fort Hood shootings in Texas in 2009. She said she had paid to keep the phone so she could continue to hear her husband's voice and so her son, John Paul, who was 2 at the time of the shooting, could someday know his father's voice.
"Now he will never hear his dad's voice," she said.
Jennifer Colandrea of Beacon, N.Y., complained to the Federal Communications Commission after she lost more than a half dozen voice mails from her dead mother while inquiring about a change to her Verizon plan. Those included a message congratulating her daughter on giving birth to a baby girl and some funny messages she had saved for more than four years for sentimental reasons. "She did not like being videotaped. She did not like being photographed," Colandrea said of her mother. "I have very little to hold onto.
"My daughter will never hear her voice now." Transferring voice mails from cellphones to computers can be done but is often a complicated process that requires special software or more advanced computer skills. People often assume the voice mail lives on the phone when in fact it lives in the carrier's server. Verizon Wireless spokesman Paul Macchia said the company has a deal with CBW Productions that allows customers to save greetings or voice mails to CD, cassette, or MP3. Many of those who've lost access to loved ones' greetings never tried to transfer the messages because they were assured they would continue to exist so long as the accounts were current. Others have fallen victim to carrier policies that delete messages after 30 days unless they're saved again. That's what happened to Rob Lohry of Marysville, Wash., who saved a message from his mother, Patricia, in the summer of 2010. She died of cancer four months after leaving a message asking him to pay a weekend visit to her in Portland, Ore. "I saved it. I'm not sure why I did, because I typically don't save messages," Lohry said.
The message was the only recording Lohry had of his mother's voice because the family never had a video camera when he was growing up. He called the line regularly for a year because he found it reassuring to hear her voice. But he called less often as time passed, not realizing that T-Mobile USA would erase it if he failed to re-save the message every 30 days. "I always thought, 'At least I know it's there,'" he said. "Now I have nothing. I have pictures. But it's something where the age we live in we should be able to save a quick five-second message in a voice mail."

Dr. Holly Prigerson, director of the Dana-Farber Cancer Institute's Center for Psychosocial Epidemiology and Outcomes Research and a professor of psychiatry at Harvard Medical School who has studied grief, said voice recordings can help people deal with their losses.
"The main issue of grief and bereavement is this thing that you love you lost a connection to," she said. "You can't have that connection with someone you love. You pine and crave it," she said. Losing the voice recording can cause feelings of grief to resurface, she said.
"It's like ripping open that psychological wound again emotionally by feeling that the loss is fresh and still hurts," Prigerson said. But technology is devoid of human emotion. In the Moores' case, Sprint spokeswoman Roni Singleton said the company began notifying customers in October 2012 that it would be moving voice mail users to a different platform. People would hear a recorded message when they accessed their voice mail telling them of the move. Sprint sent another message after the change took effect.
No one in the Moore family got the message because Alexis' damaged phone was stored in a safe.
Singleton said the company tries to make sure all of its employees understand the details of its services and policies, "but mistakes sometimes happen. We regret if any customers have been misinformed about the upgrade," she said.
Lisa Moore finds it hard to believe Sprint can't recover the message.
"I can't believe in this day and age there's nothing they can do for me," she said. Gallery Lisa Moore Michael Conroy/AP Stay Informed The news on your schedule from MPR News Update Email Address* Zip Code MPR News Update AM Edition MPR News Update PM Edition See our Privacy Policy. Must be age 13. Previous
Why has DIY gear become so popular?
By Computer Music
Click-together synths and controllers, 3D printing and crowdfunding discussed
Would you rather spend your time building gear or making music?
Doing "a spot of DIY" at the weekend is as English a tradition as a morning cup of tea or a chat about the weather, but for most people, it usually consists of putting down some new gravel in the garden or re-doing the grout on the bathroom tiles. In high-tech music-making circles, however, the term 'DIY' has recently taken on rather more exciting connotations. Recent months have seen a slew of products being launched by small companies and individuals who've taken it upon themselves to bring their creations to market. And in a nice piece of symmetry, many of these products are designed to help users come up with their own controllers, synths or other devices.Of course, it's been possible for pretty much anyone to have a go at building their own music software creations for a long time (via platforms such as Max, Reaktor, SynthMaker and SynthEdit), but as far as hardware goes, the game definitely seems to have changed.So what's going on? Why is everyone suddenly looking to build their own gear - whether they're handy with a soldering iron or not - and what effect might this DIY boom have on your music making?If you're a computer musician, arguably the most notable area to be hit by the DIY juggernaut is that of the MIDI controller. We're seeing homebrewed designs leaking onto the internet on what seems like a daily basis, so you may have started wondering if it's worth taking the plunge and coming up with your own."Why is everyone suddenly looking to build their own gear - whether they're handy with a soldering iron or not - and what effect might this DIY boom have on your music making?" Given how many off-the-shelf controllers are available, however, is there really anything to be gained by doing this? We spoke to Dave Cross, Head of Research & Development for Dubspot, Inc, and a custom controller expert.Just for you"The key benefit is choice," he tells us. "Off-the- shelf products must, by their very nature, cater to multiple usage scenarios. It's inherent to the economics of manufacturing. But a device that caters to multiple users' wants cannot cater to a single user's needs without that person also making concessions to their workflow. So 'choice' actually represents pain towards the status quo: custom is a reasonable option if your frustration with the existing market of MIDI controllers is extraordinary."Note the word "extraordinary" here - unless you're really unhappy with what's on offer, a custom controller might not be worth the effort and expense. But if you don't have any previous electronics experience, how easy - or difficult - is it to go about designing and building one?"It has gotten quite easy to wire up a knob or a slider to a circuit board and have it send MIDI," explains Dave Cross. "At the end of the day, it's just a bit of wiring. What remains difficult are the non-electronics challenges of designing enclosures, creating faceplates and sourcing parts. This second layer of detail is what turns a jumble of wires into an instrument."Perhaps in recognition of this last point - that a load of wires and components does not a functional item make - the last few months have seen a boom in products based on hardware modules that can be clipped together. 
We've seen littleBits and Korg's Synth Kit, which enables you to build your own analogue synth; the similarly themed Patchworks system, which should be available soon; and Palette, which offers a way for you to buy a MIDI controller in kit form (as a box full of individual knobs, faders and buttons) and then stick it together as you wish. Solutions like this, somewhere between building your own gear and buying it ready-made, seem to have struck a chord with consumers. Why are they so popular?

"The products you mention combine two themes: our insatiable fascination with building things, plus technology that turns a physical connection into a specialised electrical connection," reckons Dave Cross. "This is a recipe for creating 'quick wins' - a framework that allows someone to succeed with only a novice understanding of the system. Children's museum exhibits employ this technique extensively; it's a genius way to teach and to encourage creativity. That said, these products struggle with their toy-like perception."

This is an interesting point; although these Lego-style platforms are proving popular with consumers, it's hard to know whether people are attracted to them because they think they're going to be able to build something genuinely useful with them, or just because they like the idea of going through the building process itself. A click-together controller might sound like a great idea, but is it really going to be something that you'll use in your studio, or will it turn out to be yet another distraction?

A new dimension

That said, it seems likely that modular platforms that ease the DIY process are going to become increasingly popular, not least because it's going to get easier and easier for people to create them. No longer can would-be manufacturers simply design products - thanks to 3D printing, they can create them, too.

"I would bet my first-born that home fabrication will significantly disrupt existing models of production and distribution," says Dave Cross. "But I can't be sure which major industry will be disrupted first because right now, the most well-known market for 3D printed designs is filled with novelty items. 3D printing is a revolution for tinkerers with 3D CAD expertise. It won't be a revolution for the masses until there's a library of useful 3D files to print from."

True enough, but if 3D printing is one buzzphrase that's associated with small-scale hardware production, crowdfunding is most certainly the other. When a DIY project is launched, it almost feels inevitable now that it'll have a Kickstarter (or similar) campaign behind it, so are these platforms making it easier for newcomers to get into the market?

"The 3D printing and Kickstarter revolutions go hand-in-hand: both have the potential to disrupt the existing models of producing and distributing goods," believes Dave Cross. "3D printing will do it by eliminating the need for quantity, and crowdfunding will do it by tipping the scales away from distributors and towards manufacturers."

"Crowdfunding can solve a prickly chicken/egg problem inherent to manufacturing. Your widget may cost £50, but it will cost £30,000 upfront to prepare the machines for your product.
Crowdfunding that initial investment turns a major gamble into a surefire bet."

"That said, many manufacturers seem more than willing to take that gamble with their own funds after they've achieved initial success. In other words, how many companies have put their 'second' product up for crowdfunding? What does that say about the current value of crowdfunding to an established manufacturer?"

It's an interesting point, but it does seem that a serendipitous set of circumstances has made it easier than ever for would-be manufacturers to bring their products to market. This has the potential to affect not just the types of products that we buy in the future, but also our ability to create things ourselves.
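Cross's £50 widget with £30,000 of up-front tooling is easy to turn into a break-even calculation, which is exactly the arithmetic a crowdfunding campaign is trying to de-risk. The sketch below is illustrative only; the 40% contribution margin is an assumed figure, since the article gives just the retail price and the tooling cost.

```python
# Toy break-even arithmetic for the hypothetical £50 widget / £30,000 tooling example.
# The 40% contribution margin is an assumption, not a figure from the article.
import math


def units_to_break_even(tooling_cost: float, unit_price: float,
                        contribution_margin: float = 0.40) -> int:
    """Units that must be (pre)sold before the up-front tooling is recouped."""
    per_unit_contribution = unit_price * contribution_margin
    return math.ceil(tooling_cost / per_unit_contribution)


if __name__ == "__main__":
    print(units_to_break_even(30_000, 50))  # 1500 units at a 40% margin
```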
OSA Past President Thomas M. Baer Testifies on Capitol Hill
House Subcommittee on Technology and Innovation examines ways to serve the biomedical community
WASHINGTON, Feb. 24—Testifying on Capitol Hill today, Optical Society (OSA) Past President Thomas M. Baer called for greater involvement by the National Institute of Standards and Technology (NIST) in supporting the healthcare industry through developing standards and expanding its own ongoing research efforts in bioscience and healthcare. Baer testified today before the House Science and Technology Subcommittee on Technology and Innovation on ways in which NIST could better serve the needs of the biomedical community. The hearing, How Can NIST Better Serve the Needs of the Biomedical Research Community in the 21st Century?, is one in a series designed to look at potentially restructuring science and technology agencies ahead of the Committee's reauthorization of the America COMPETES Act. Specifically, the subcommittee is examining how biomedical research at NIST could be structured to achieve goals such as increasing NIST's technical expertise and outreach efforts through collaborations with academic institutions, private industry and nonprofits. Additionally, the subcommittee is looking at ways to develop mechanisms that allow NIST to obtain effective and targeted input and feedback from industry, academia and nonprofits.
“Translating the tremendous advances in quantitative biology instrumentation into effective diagnostic tests will require developing standard reference materials, reproducible consensus protocols, and understanding the basic measurement science underlying these new quantitative biomedical instruments,” Baer stated in his testimony today. “Much of this work has yet to be done and lack of this standards framework is impeding the translation of these new technologies into medical practice, affecting the lives of many critically ill U.S. citizens who could benefit from accelerated introduction of these breakthrough technologies. NIST can play a pivotal role in accelerating deployment of these remarkable new instruments and procedures.”
Baer went on to describe four ways in which NIST could provide the greatest service to the biomedical community, which include:
In particular, developing standards, consistent protocols, and advancing measurement science in applying quantitative molecular analysis technology to diagnostic tests
Supporting the application of the newest generation of quantitative imaging instruments (e.g. CT, MRI, ultrasound)
Working with the drug development industry to accelerate the drug development process
Improving understanding of the technology needed to perform the measurements necessary to provide accurate assessment of the safety and efficacy of new drugs
Working with universities and private industry to development methods for new classes of therapy enabled by advances in stem cell science, with applications in diseases such as diabetes and organ replacement
Providing a sound basis for measurement science in the area of neuroscience and neuromedicine, with applications in Parkinson’s disease and Alzheimer’s disease
Today’s hearing is a follow-up to the hearing held Sept. 24, 2009 titled: The Need for Measurement Standards to Facilitate Research and Development of Biologic Drugs. During the September 2009 hearing the Committee examined the need to develop measurements, reference materials, reference standards, standard processes, and validation procedures to improve the research, development and regulatory approval of biologics. Industry experts and the FDA expressed that there is a need for NIST to perform basic measurement science research to support the growth of the biologics industry.
Baer is the executive director of the Stanford Photonics Research Center and served as OSA President in 2009. His involvement with NIST includes his current position as a member of the NIST Visiting Committee for Advanced Technology, and a previous six-year term on the National Research Council review panels for both the Physics and Chemical Science and Technology Laboratories. Throughout his extensive career in the fields of lasers and optics and photonics, Baer has co-authored numerous scientific publications in the areas of atomic physics, quantum electronics, laser applications and biotechnology.
Also testifying in today’s hearing were Sharon F. Terry, president and CEO of Genetic Alliance and Daniel Sullivan, professor and vice chair for research in radiology at Duke University Medical Center and science advisor to the Radiologic Society of North America. For more information on the hearing, including full testimonies, visit the Science and Technology Committee’s Web site.
Uniting more than 106,000 professionals from 134 countries, the Optical Society (OSA) brings together the global optics community through its programs and initiatives. Since 1916 OSA has worked to advance the common interests of the field, providing educational resources to the scientists, engineers and business leaders who work in the field by promoting the science of light and the advanced technologies made possible by optics and photonics. OSA publications, events, technical groups and programs foster optics knowledge and scientific collaboration among all those with an interest in optics and photonics. For more information, visit www.osa.org.
Panasonic to launch tender offer for Sanyo
Panasonic plans to launch a tender offer for shares of Sanyo Electric with the hope of acquiring a majority stake in the smaller and struggling rival.
Martyn Williams (IDG News Service) on 22 December, 2008 08:23
Japan's largest consumer electronics company, Panasonic, plans to launch a tender offer for shares of Sanyo Electric with the hope of acquiring a majority stake in its smaller rival, the company said.

The offer, which has been anticipated for several weeks, was finalized at board meetings of the two companies on Friday and could see Sanyo become a unit of Panasonic within the first quarter of 2009.

Panasonic sees several benefits in acquiring Sanyo. Sanyo is the world's biggest manufacturer of lithium-ion batteries and an innovator in green-energy products such as solar cells, which are both business areas Panasonic is keen to get into. Panasonic has a sizeable battery business and sees a combination of the two companies as key to growing in the emerging hybrid electric vehicle and electric vehicle markets. Panasonic also thinks it can grow Sanyo's solar business.

The two companies are also linked in history. When Konosuke Matsushita started Matsushita Electric Devices Manufacturing Co., the forerunner to today's Panasonic, in 1918 to make electric plug adapters for light sockets, he took on a small number of staff including brother-in-law Toshio Iue. In 1947 Iue would go on to start Sanyo Electric. Today the two companies remain based in Osaka and their headquarters are only a short drive apart.

The acquisition, which is unusual in Japan's conservative business world, has been the subject of discussions between Panasonic and Sanyo's three biggest shareholders for the last few weeks.
Japanese financial groups Sumitomo Mitsui and Daiwa Securities and U.S.-based Goldman Sachs bought roughly ¥300 billion (US$3 billion) of preferred shares in Sanyo in March 2006 as part of the company's restructuring efforts. Collectively the shares, if converted to common stock, would represent around 70 percent of Sanyo's outstanding shares. The shares can only be sold with Sanyo's permission, however, and that part of the contract will expire in March 2009, so the pressure has been on to work out a deal before then.

Panasonic originally offered ¥120 per share but that was rejected by Goldman Sachs as too low, according to local press reports. Subsequent talks saw the price raised to ¥130 and again to ¥131 -- the price at which the parties settled.

To pay for the acquisition, Panasonic said it would offer up to ¥40 billion in bonds sometime after January 2009.
Microsoft to host e-mail for 120,000 USDA workers
The workers, who will also get SharePoint, IM and webconferencing, will migrate from 21 different e-mail systems
Nancy Gohring (IDG News Service) on 09 December, 2010 07:28
As many as 120,000 federal workers at the U.S. Department of Agriculture will soon start using hosted e-mail and other applications provided by Microsoft as part of a deal announced Wednesday.

Within the next four weeks, the USDA expects to start moving employees, who are currently using 21 different e-mail systems, to the new hosted service, it said. In addition to e-mail, the workers will have access to hosted SharePoint, Office Communications for instant messaging and Live Meeting for webconferencing.

Microsoft will also manage the software and networks for the USDA.

The agreement is a big win for Microsoft, which has had to face a new competitor for big government deals: Google. Earlier this month the U.S. General Services Administration said that Google would offer it hosted e-mail services. Google said the GSA was the first federal agency to use a hosted e-mail service.

Google also has contracts to offer Google Apps to government workers in Washington, D.C., and Orlando. It also won a high-profile contract with the city of Los Angeles, though that rollout has been delayed.
Microsoft has also secured some big deals to provide hosted services to government agencies. The city of New York and the states of California and Minnesota have chosen Microsoft for cloud-based e-mail.

Both companies have now received certification under the Federal Information Security Management Act, a stringent security standard that some federal agencies are required to comply with. Google achieved the certification in July, while Microsoft announced its certification last week.

Nancy Gohring covers mobile phones and cloud computing for The IDG News Service. Follow Nancy on Twitter at @idgnancy. Nancy's e-mail address is Nancy_Gohring@idg.com
U of A astronomer helps discover ultraluminous X-ray bursts in space
By Stuart Thomson
Ultraluminous X-ray burst image captured by the Chandra X-ray Observatory. Photo credit: Harvard-Smithsonian Center for Astrophysics.
A University of Alberta astronomer may have helped discover an entirely new class of explosive event in space, as part of a global research team.
The still-unknown objects erupt with X-rays, spontaneously flaring up to 100 times brighter than normal and then returning to their original X-ray levels. The flare-ups happen in less than a minute and return to normal after about an hour.
X-rays like this are generated by extremely hot material, at temperatures equivalent to "millions of degrees Celsius," said Gregory Sivakoff, a U of A researcher who co-authored the study that will appear in Nature.
The team has come up with a number of explanations for the source of the flares, with the most likely being that they are caused by matter from a nearby star falling into black holes or neutron stars. Black holes and neutron stars are extremely compact remnants of stars and a neutron star contains the entire mass of a star in an area the size of Edmonton, said Sivakoff. A black hole is even smaller.
When material from the star rubs against the disc around the neutron star or black hole, it gets extremely hot and then falls inward, creating the X-rays. Sivakoff said a disruption in the system may be causing a large amount of material to fall at once, which causes a rush of X-rays and creates the massive flare-ups.
Due to the extreme brightness of the eruptions, the team has named them "ultraluminous X-ray bursts." The researchers say these types of flares have been important to astronomy in the past and they hope this will be the case again. For example, some supernovas were essential for understanding how the universe is expanding and how the material that created life spread.
“To our knowledge, this behaviour is new, unexplored phenomena,” said Sivakoff.
The study had 11 authors from 10 international institutions, including the University of Alabama. The discovery was made using NASA’s Chandra X-ray observatory and the European Space Agency’s XMM-Newton observatory. The two telescopes are orbiting in space and each organization accepts proposals from researchers who are interested in using them.
sxthomson@postmedia.com
twitter.com/stuartxthomson
W boson squeezes Higgs particle
The W boson will help to corner the elusive Higgs particle. And photographer Robert Tilden has a photo to prove it.
The W boson is squeezing the Higgs boson: plush toys by artist Julie Peasley, photo by Robert Tilden.
The DZero collaboration at the Department of Energy's Fermilab has achieved the world's most precise measurement of the mass of the W boson by a single experiment. Combined with other measurements, the reduced uncertainty of the W boson mass will lead to stricter bounds on the mass of the elusive Higgs boson.
This inspired Tilden, a software engineer at Northwestern University and photographer, to create a photo of the W boson squeezing the Higgs boson in a vise. Artist Julie Peasley created the plush toys representing the particles, and they are available for sale on http://www.particlezoo.net/
The W boson is a carrier of the weak nuclear force and a key element of the Standard Model of elementary particles and forces. The particle, which is about 85 times heavier than a proton, enables radioactive beta decay and makes the sun shine. The Standard Model also predicts the existence of the Higgs boson, the origin of mass for all elementary particles.
Precision measurements of the W mass provide a window on the Higgs boson and perhaps other not-yet-observed particles. The exact value of the W mass is crucial for calculations that allow scientists to estimate the likely mass of the Higgs boson by studying its subtle quantum effects on the W boson and the top quark, an elementary particle that was discovered at Fermilab in 1995.
Scientists working on the DZero experiment now have measured the mass of the W boson with a precision of 0.05 percent. They announced their result in a Fermilab press release today. The exact mass of the particle measured by DZero is 80.401 +/- 0.044 GeV/c2. The collaboration presented its result at the annual conference on Electroweak Interactions and Unified Theories known as Rencontres de Moriond last Sunday.
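The quoted precision is simply the ratio of the uncertainty to the central value, which is easy to verify from the numbers in the release alone.

```python
# Quick arithmetic check of the quoted DZero precision:
# 80.401 +/- 0.044 GeV/c^2 corresponds to roughly a 0.05 percent measurement.
mass_gev = 80.401
uncertainty_gev = 0.044
relative_precision = uncertainty_gev / mass_gev
print(f"{relative_precision:.2%}")  # prints 0.05%
```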
"This beautiful measurement illustrates the power of the Tevatron as a precision instrument and means that the stress test we have ordered for the Standard Model becomes more stressful and more revealing," said Fermilab theorist Chris Quigg.
The DZero team determined the W mass by measuring the decay of W bosons to electrons and electron neutrinos. Performing the measurement required calibrating the DZero particle detector with an accuracy around three hundredths of one percent, an arduous task that required several years of effort from a team of scientists including students.
Since its discovery at the European laboratory CERN in 1983, many experiments at Fermilab and CERN have measured the mass of the W boson with steadily increasing precision. Now DZero achieved the best precision by the painstaking analysis of a large data sample delivered by the Tevatron particle collider at Fermilab. The consistency of the DZero result with previous results speaks to the validity of the different calibration and analysis techniques used.
"This is one of the most challenging precision measurements at the Tevatron," said DZero co-spokesperson Dmitri Denisov, Fermilab. "It took many years of efforts from our collaboration to build the 5500-ton detector, collect and reconstruct the data and then perform the complex analysis to improve our knowledge of this fundamental parameter of the Standard Model."
The W mass measurement is another major result obtained by the DZero experiments this month. Less than a week ago, the DZero collaboration submitted a paper on the discovery of single top quark production at the Tevatron collider. In the last year, the collaboration has published 46 scientific papers based on measurements made with the DZero particle detector. About 550 physicists from 90 institutions in 18 countries work on the DZero experiment.
Women main users of app on screening depression: group
EMOTIONAL HEALTH: Men who are under social pressure to conceal their emotions were advised to also regularly use the self-assessing application
By Alison Hsiao / Staff reporter
Seventy percent of the users of a depression screening application developed by the John Tung Foundation and the Social Entertainment Enterprise are women, the foundation said, urging men to also regularly monitor their emotional health.

About 820,000 people have taken online or paper-based depression self-assessments every year since the foundation started the Depression Screening Day project in 2000, it said.

Citing a study published in the American Journal of Orthopsychiatry in 2011, the foundation said that depression screening at university health centers can help medical providers play a bigger role in early depression identification and intervention for students.

The foundation and its partner enterprise developed an iOS version of the app called "DS Depression Screening," which was downloaded 40,000 times in the five months since its launch in November.

Yeh Ya-hsin (葉雅馨), head of the foundation's mental health section, said that more than half of the app users fall in the 20-to-29 age group, which corresponds with the age distribution of smartphone users, while only 2.32 percent are aged 50 and over.

"In terms of gender, 70 percent of the depression screening app users are female," Yeh said.

"Taiwanese men are under social pressure to conceal their emotions. By using the self-assessing application, they can take a deeper look into their mental world," Yeh said.

Following the success of the iOS version, the foundation and the Social Entertainment Enterprise said they would release an Android version with an improved feature on after-screening advice later this month.
Photo 27 April 2003 by Gary Fewless
Lake Sturgeon (Acipenser fulvescens) Spawning in the Wolf River near Shiocton, WI

Lake Sturgeon (Acipenser fulvescens) are the largest fish species endemic to the Great Lakes Basin. There are 24 species of sturgeon (family Acipenseridae) in the world, 5 of which occur in North America. Sturgeon species as we know them now have remained relatively unchanged from their ancestors of 125,000,000 years ago. They lack scales like more modern fishes and instead are covered with five rows of bony plates sometimes called scutes. They are slow maturing and long-lived. Females do not spawn until they are at least 14 years old, but usually not until they are 20 years old. Males mature earlier, between 8 and 17 years of age. Like many fishes they continue to grow as they age, eventually weighing about 80 lbs, but the Wisconsin record catch was a fish weighing 170 lbs. Lake Sturgeon have average life spans of 80 years for females and 55 years for males, but individuals over 150 years have been caught. They usually weigh between 10 and 80 lbs. They feed by siphoning small animals including insect larvae, crayfish, snails, clams, and leeches.
Spawning occurs during April and May when water levels and temperatures are favorable. The fish migrate to clean shallow rocky reefs in larger rivers or along islands. The fish will often perform "staging" displays where they roll on their sides and jump out of the water. Females only lay eggs every 4 or 5 years, so that in any given year only 15% of the population is reproducing. Each female will lay 100,000-800,000 eggs in shallow rocky river bottoms during the spawning season, but most of these will be eaten by predators, like crayfish and carp. Highly prized in Wisconsin in the late 1800s for their eggs, as well as for gelatin, oil, and glue, sturgeon were fished down to 10% of their original population. In 1903 the first size limit restrictions were imposed, and in 1915 Wisconsin began the first sturgeon fishery management program when it closed the fishery to allow the population to recover. In 1931 a regulated spearing fishery opened, and a scientifically regulated fishery based on population estimates and spawning estimates has been in place since the 1940s. Sturgeon are under continuous threat from dams, pollution and habitat degradation, as well as from fishing. Because sturgeon are bottom feeders, they are at risk from pollutants that accumulate in the sediments they feed from. They are also at risk during spawning if there is not enough suitable habitat for eggs to survive and hatch. Today, the sturgeon spearfishery is highly regulated so that only a small proportion of reproductive adults are removed from the population each year.

Lake Sturgeon Spawning Viewing Sites (from WI DNR):
Wolf River - Over 50 places to watch along river.
Fox River - Over 50 places to watch along river.
Pfeifer Park, New London - Take Hwy. 45 north and turn right onto Waupaca St. Follow Waupaca St. to Embarrass Dr. and turn left. Walk the south shoreline of the Embarrass River.
Mukwa Wildlife Area, New London - Take County road X west from New London for 2 miles. Park in the Mukwa Wildlife Area parking lot and walk along the south shoreline of the Wolf River.
Shawano Dam, Shawano - Take Hwy. 29 west through Shawano to the bridge crossing the Wolf River.
Shiocton, "Bamboo Bend" - Take Hwy. 54 near "Bamboo Bend" where Old Hwy. 54 crosses the Wolf River. Overpasses and riverbanks allow public viewing. (Our photos were taken at Shiocton)
The Advent of Airborne Wind Power
Airborne wind is on track to become a cost effective, practical, and utility scale-ready segment of the industry. National Instruments describes its role.
By: Brian MacCleery
You might be an expert on conventional horizontal and vertical wind turbines, but have you heard of airborne wind? If the pace of innovation in the nascent airborne wind industry is any guide, 10 years from now "airborne wind turbines" could become a household word. Why? In most of the Northern hemisphere, just a few thousand feet above our heads blows a vast untapped resource on par with some of the best ground-based wind sites in the world. Go 10 times higher into the troposphere and you'll find the highest density source of renewable energy in the world.

Harnessing high altitude airborne wind may take some major leaps, but closer to Earth airborne wind is on track to become a cost effective, practical, and utility scale-ready segment of the wind industry within the decade. Most airborne wind companies have their sights set on the "boundary layer" winds that blow a few thousand feet above ground level. Bringing utility scale airborne wind to market at those altitudes doesn't require any breakthroughs, just solid engineering work, R&D investment, and the support and guidance of the experienced ground-based wind community.

At least 30 startups and research groups around the world are busy at work to make airborne wind a reality. An abundance of commercial off the shelf (COTS) technologies and tools are enabling them to achieve a remarkable pace of innovation. Over the years their prototypes have proven the basic principles of airborne wind and grown into the tens of kilowatts. The next step for the industry leaders is to prove their systems can perform reliably during long-term continuous operation in the field.

Airborne wind is in its infancy, but if it makes it off the ground it would help extend the reach of the wind industry to new locations where ground-based turbines aren't cost effective today. Makani Power, a well-funded leader in airborne wind, believes its Airborne Wind Turbine (AWT) technology can extend the developable terrestrial wind resource area by five times, to over 80 percent of the U.S. land surface. Paired with ground-based turbines, airborne wind promises to help keep the power lines humming by reducing the variability of production, and by "going vertical" to extract more energy from a given land area.

Innovation through Assimilation

Airborne wind borrows many established technologies from the rest of the wind energy industry, sometimes even using the same type of generators, gearboxes, and grid-tied power converters. Table 1 compares similarities and differences between ground-based and airborne turbines. The main feature that makes airborne turbines different is the way they extract energy from the wind. Instead of a large steel tower structure, a tether cable anchors the system to the ground. Rather than rotating blades, specially designed airfoils sweep a path across the sky to extract energy.

This ability to sweep through a larger cross section of the sky is one of the fundamental attractions of airborne wind, enabling a modestly sized airfoil to extract large amounts of energy from the stronger, more-consistent winds found higher above ground. Like the tip of a conventional turbine blade, the airfoil flies crosswind in a circle or figure-eight pattern at many times the speed of the wind, as shown in Figure 1.

Airborne and ground-based turbines operate on much the same aerodynamic principles. Just like conventional wind energy, power production is proportional to one-half the air density times the cube of wind speed (Equation 1).
Thus, a small increase in wind speed makes a big difference in power, since doubling wind speed yields eight times more power. Like the tip of a conventional turbine blade the airfoil wing travels at high speeds through the air, using aerodynamic lift to efficiently extract energy. At a wingspan comparable in length to a wind turbine blade, an airborne turbine can sweep a larger region of the sky to access nearly 10 times more energy. Airborne turbines can also move up or down in altitude and adjust their flight path to adjust for a wide range of wind conditions. Mechanically, airborne turbines benefit from being cushioned in a pillow of air during flight rather than being rigidly connected to the ground—however, the g-force loads caused by their fast moving patterns can put significant stress on airfoil structures and tether lines.

Operating at Altitude

By going up above the reach of ground-based turbines, airborne machines chase the consistent and stronger wind resource at altitude. At 2,000 feet wind speeds above 8 m/s are blowing more than 40 percent of the time at most locations in the northern hemisphere. Furthermore, power densities (kW/m²) are on par with the world's most favorable sites for ground-based wind, as illustrated in Figure 2. Thus, airborne turbines should expand the reach of the wind industry into new regions and could be located closer to population centers.

Although boundary layer winds provide the "low hanging fruit" being chased by most companies today, the most elusive prize is found in the jet stream winds of the troposphere. At 35,000 feet, average power densities soar beyond 20 kW/m² and the total available resource is measured in thousands of terawatts (TW), hundreds of times higher than world energy demand. "From an environmental perspective, getting huge terawatt-scale renewable systems is really important," according to Ken Caldeira of the Carnegie Institution of Washington at the Airborne Wind Energy Conference 2010. "The idea that airborne wind power is of the scale to meet civilization's needs is sound. Airborne wind energy is one of the few civilization scale power generation technologies."

Caldeira is a world-renowned climate scientist who has studied the potential environmental impact of extracting civilization scale power levels from high altitude wind. His models indicate that extracting 18 TW, enough energy to satisfy world energy demand, would itself have no significant impact on climate, resulting in a negligible cooling effect of 0.04 °C. Of course, switching the world to cheap, clean, carbon-free power would certainly have a positive impact on the environment.

Harnessing high-altitude wind is a bold vision but brings with it a wide range of technical and logistical challenges, from finding tether lines that are strong and light enough to gaining Federal Aviation Administration (FAA) approval and airspace clearance. Even at boundary layer altitudes, FAA permitting questions need to be resolved. At least for now, making tropospheric wind commercially viable is likely to remain elusive. Even companies with their sights set on the troposphere plan to start at more modest heights. "At 1,000 feet wind is profitable on 70 percent of the world," says Pierre Rivard, CEO of MAGENN Power, a company developing lighter than air generators resembling blimps that also double as floating cell phone towers.
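The cubic dependence quoted above (Equation 1) is easy to check numerically. The sketch below evaluates the standard wind-power-density expression P/A = ½ρv³; the sea-level air density and the sample wind speeds are illustrative assumptions, not figures from the article.

```python
# Worked check of Equation 1: power density per unit swept area, P/A = 0.5 * rho * v^3.
# The air density (1.225 kg/m^3 at sea level) and the sample speeds are assumed
# values used only to illustrate the cubic relationship described in the text.
RHO_AIR = 1.225  # kg/m^3


def power_density(wind_speed_mps: float, rho: float = RHO_AIR) -> float:
    """Ideal (pre-losses) wind power density in W/m^2."""
    return 0.5 * rho * wind_speed_mps ** 3


for v in (4.0, 8.0, 16.0):
    print(f"{v:>4.1f} m/s -> {power_density(v):8.1f} W/m^2")
# Doubling the wind speed multiplies the available power by 2**3 = 8,
# which is why the stronger, steadier winds at altitude matter so much.
```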
Exploring the Design Space

Looking top down, there are three primary ways to classify airborne wind systems: 1) altitude: low, boundary layer or troposphere; 2) type of airfoil: lighter or heavier than air, rigid or flexible, drag, lift, or rotorcraft based; and 3) generator location and type: ground or airborne, gearbox or direct drive.
Like the early days of the ground-based wind industry, researchers in airborne wind are testing every possible design choice to find out what works best. Even with computer models there's no substitute for a physical prototype. Working prototypes are also going a long way toward convincing skeptics and attracting investors. "I think people were more skeptical a few years ago, but today there are many startup companies working on this," says Archan Padmanabhan, director of business development at Joby Energy. "Over the past several years, the industry has seen many prototypes, and we've seen them growing larger into the tens of kilowatts. At Joby we started with ground-based generator designs, tested autogyro concepts, and finally moved to a winged airborne generation system with multiple rotors and power sent down the cable. Now we're exploring ways in which the wing can be structured to simplify takeoff and landing."

Overcoming Technical Challenges

The idea of using a "tethered airfoil" as a way to generate power isn't new. The fundamental concepts were patented in the late 1970s and are now public domain. Lawrence Livermore National Lab researcher Miles L. Loyd pioneered the fundamental mathematical equations for airborne wind, developed computer models, and validated them with small experimental prototypes. His seminal paper, "Crosswind Kite Power," was published in the June 1980 issue of the Journal of Energy.

So why hasn't airborne wind reached commercial viability? The answer has to do with complexity, cost, and Moore's law. Just 10 years ago the processing, instrumentation, sensors, and control software were prohibitively expensive. Today exponential increases in embedded computing power (Moore's law), instrumentation technology, and the availability of high-level, high-productivity software tools and rugged embedded computing platforms enable airborne wind companies to build and test their prototypes by the dozen, using readily available COTS technology to shrink the time between design, prototype, and field deployment.

"Would airborne wind have been cost effective 10 years ago?" Padmanabhan asks. "No. It's definitely advances in technology that make it cost effective today, from inexpensive aircraft materials to low-cost GPS sensors, autonomous flight software, and the increasing power of embedded computing. The biggest technical hurdle that's been overcome is in the area of control systems. Thanks to the aircraft industry, flight control systems have become a lot more robust than at any time before. Commercial airlines today are primarily flown on autopilot and people trust those systems—no one expects airplanes to come crashing down. The aircraft industry has a lot to offer, and we are learning from it."

Windlift, another airborne wind startup, is developing mobile airborne wind turbines that have attracted significant interest from the U.S. military because their high power density makes them a future replacement for diesel generators and the vulnerable fuel convoys that must supply them. Windlift uses the National Instruments LabVIEW graphical programming language and NI CompactRIO ruggedized embedded instrumentation systems for control and dynamic monitoring, as shown in the interface for their 12 kW prototype system (Figure 3). Control Design Engineer Matt Bennett explains how commercially available tools play an important role in their development. "Having a COTS real-time system is a big enabler," he says.
“Actively flying the airfoil under high load is a real challenge and the NI CompactRIO system takes care of all of the signal processing and feedback control tasks required to keep the system stable. We extensively use the field programmable gate array, or FPGA, which handles tasks completely in parallel. The LabVIEW FPGA technology is indispensible. There are a lot of things it does that we couldn’t do any other way.”Power Generation TechnologiesWe are all familiar with the spinning blades of ground-based turbines. So how do airborne turbines generate power? Although techniques vary widely—from tethered rotorcraft to lighter than air spinning blimps—the most mundane and popular techniques leverage one of two basic principles: 1) a ground-based generator attached to the tether cable winch which produces power as the kite pulls out the cable, or; 2) a set of high speed wind-driven propellers onboard the airfoil that drive small airborne generators. Table 2 compares the pros and cons of these popular approaches.Ground-based generator systems, like those being developed by Windlift in the U.S. and KITEnergy in Europe, produce power when the airfoil pulls a tether line. The torque and velocity of the tether cable produces electricity by spinning a generator that is attached to a rotating winch drum. As illustrated in Figure 4, there are two distinct modes of operation—the traction phase, and the recovery phase. In the traction phase the airfoil slowly pulls the tether line and electricity is produced until the maximum tether length or altitude is reached. Then the recovery phase begins, during which the airfoil is flown back while the tether cable is winched in. Recovery actually uses a small amount of power, as the generator becomes a motor drive to retract the cable. Then the process begins again.For steering, the airfoil wirelessly transmits GPS coordinates and roll, pitch, and yaw information from an inertial measurement unit (IMU) in the air to a kite steering unit (KSU) on the ground. KITEnergy uses the National Instruments PXI platform and LabVIEW Real-Time software as the ground control unit, which acquires and processes the sensor signals and executes advanced control algorithms to command the winch motor-generator and steer the kite. “Theoretical, numerical, and experimental results so far indicate that KITEnergy technology could provide large quantities of renewable energy, available practically everywhere, at lower cost than fossil energy,” according to KITEnergy founder Mario Milanese.Other companies, such as Joby Energy and Makani Power, are pursuing airborne generator designs. In this case a number of small propeller-driven generators located on the aircraft are used for power generation, and power is sent down the tether cable to the ground. Airborne generator systems are typically more like an aircraft and less like a kite, featuring an onboard computerized autopilot system and flight control surfaces to control roll, pitch, and yaw like elevators and ailerons. A great deal of engineering effort at airborne wind companies is focused on perfecting these flight control systems and making them robust to any sort of problem, from gusting winds to actuator and sensor failures. The Makani Power system is being designed so it can even disconnect from the tether and land autonomously if needed. Any control or aerospace engineer looking for a fascinating challenge should consider working in the airborne wind industry. 
Figure 5 illustrates the salient features of the Joby Energy design, which is capable of vertical takeoff and landing and uses high-speed crosswind flight patterns to maximize energy output.

Not surprisingly for such a nascent segment of the wind industry, the dust has yet to settle on which design choices prove to be the most practical and cost effective. It is way too early to call this race, but my guess is that the leading contender today is a boundary layer system with a rigid airfoil that's carefully designed to maximize aerodynamic efficiency and an onboard flight control system (Figure 6).

Airborne wind has a way to go before becoming a mature technology, but one thing is for sure—it's an exciting time. Each new prototype that takes flight helps to convince skeptics and investors alike that above-ground wind power isn't such a crazy idea. If you're a professional in the wind industry, consider lending your talents to help airborne wind get off the ground. To learn more visit the Airborne Wind Energy Consortium Web site at www.aweconsortium.org.

About The Author
Brian MacCleery is clean energy product manager at National Instruments. Go online to www.ni.com.
Top New Android Games Of The Week: July 3rd
July 3, 2015 - Written By Justin Diaz
A number of new games launched this week onto the Play Store and if you’re looking for a few titles which might be able to help you kill some time during this 4th of July holiday weekend, check out the list below for some great new additions to the Android gaming scene.
The Silver Bullet is a top angle shooter with plenty of action. In the game, your character has to infiltrate a military research lab which has made a pact with the devil. As you progress you'll come across a number of demons and enemy military combatants, and with your trusty dual pistols and silver bullets (hence the name) you'll take them out one by one. The gameplay is basically a dual-stick shooter type, and there seems to be a fair bit of complexity to the storyline and the attention to detail in the level design. If you like games you can pay once for and be free of ads or IAPs, check out The Silver Bullet.
Socioball
Socioball is a puzzle game, so right off the bat you know it’s going to have some challenging moments. Each level tasks you with getting the little blue ball from one point to another on the board, and each time you’ll have to complete this task by following along a certain path. The trick is that tiles on the board will be missing, and you’ll have to insert the correct tiles in the right spots to get from start to finish. It’s a rather cool little game with over 60 puzzles so if you like games that make you think, give Socioball a look.
If you’re a MOBA game fan, you’ve likely already heard about Vainglory. It’s been a wildly popular hit on iOS, and now it’s finally launched on the Android platform so strap yourself in, steel your nerves and prepare for intense, chaotic team fights. The game is free to play like most other MOBA titles, and while there are IAPs you won’t ever have to spend a dime. As long as you play you’ll be able to unlock other characters. If you enjoy PvP combat games, Vainglory is definitely not to be missed.
Dead Among Us
Dead Among Us feels and plays a lot like the Contract Killer games from Glu, so if you enjoyed those then you’ll probably find enjoyment in Dead Among Us. In this game you’ll be armed with a bow and arrow and you’ll be taking out zombies and other undead, but it won’t be just a bunch of mindless shooting as there does seem to be some objective based tasks here and there, like defending companions while they attempt to pick up supplies. As you progress through the game you’ll switch to different vantage points to make sure you can keep an eye on the encroaching zombie horde.
This is the sequel to the original Sonic Dash which released last year. It has some pretty updated visuals to take advantage of the more powerful devices and you can even play as some new characters it looks like. It also lets you run with three characters at a time and SEGA has incorporated new challenges and new level backgrounds too. The only downside to this game right now is that SEGA has only soft launched in a number of regions, and sadly the U.S. is not one of them. For now there is no release date for the U.S. launch of the game, and we couldn’t find out the soft launch regions. If you’re interested in trying out Sonic Dash 2, check the link and see if it’s available in your area.
Google’s Chrome For Android Reaches A Billion InstallsThe LG G2 Will Be Updated To Android 5.1.1
| Android Games | 科技 |
Skin-care industry skipping out on science?
BOSTON, Aug. 22, 2007 -- The multi-billion-dollar global cosmetics and skin-care-product industry sometimes is beset by a me-too mindset in which research and development focuses on matching the competition rather than applying sound science to improve products, a scientist told the 234th national meeting of the American Chemical Society.
As a result, it could be missing a golden opportunity to provide consumers with more effective products, according to Stig E. Friberg, Ph.D., a chemist who studies cosmetic ingredients. As an example, Friberg points out that previously unknown changes occur in the structures of colloids used in skin care lotions. As a result, the lotion sitting in the bottle, he said, is actually different from the same lotion applied to the skin. Friberg has spent years in fundamental studies of the backbone of any lotion -- a mixture or "emulsion" of oil and water. Along with a third ingredient, a surfactant that keeps the liquids from separating, emulsions are the basis of almost every skin lotion. Although the system may sound simple, Friberg said it's not as straightforward as scientists once believed.
Friberg's work has revealed that after application, evaporation causes a lotion's internal structure to change, a fact that has not captured the attention of the skin-care industry. Initially in a liquid phase, the structure transforms while on the skin to a more orderly state, such as a liquid crystalline or solid amorphous phase, that allows for a higher tendency for molecules to enter the skin, he said. Previously, scientists have assumed the structure of an emulsion remains intact as lotions evaporate. But this isn't the case. "In fact, the appearance of liquid crystalline structures in the emulsion acts as if you have a much higher concentration of the active substance on the skin," said Friberg, who is with the University of Virginia. "Knowledge of the structure change will make the formulation of skin lotions more systematic." A main goal of the system is to find the best active ingredients for a given emulsion. In the land of lotion, these ingredients do the dirty work by penetrating the skin to protect or improve it. Well known active ingredients are salicylic acid used for complexion and camphor as an analgesic. Lotions on the market today, while effective, are based on limited understanding of how the active ingredients smooth and moisturize the skin. Research therefore has been based primarily on efforts to improve traditional, successful combinations of surfactants, oils and active substances.
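The "higher effective concentration" Friberg describes has a simple mass-balance intuition behind it: as water evaporates from the film on the skin, the same amount of active ingredient is left dissolved in less and less material. The sketch below only illustrates that arithmetic; the starting composition and evaporation fractions are invented numbers, not data from Friberg's work.

```python
# Illustrative mass balance: how losing water concentrates the remaining film.
# The 2% active / 70% water starting lotion is a made-up composition.
def active_fraction_after_evaporation(active_frac: float, water_frac: float,
                                      water_lost_frac: float) -> float:
    """Weight fraction of active ingredient after a fraction of the water evaporates."""
    total_remaining = 1.0 - water_frac * water_lost_frac
    return active_frac / total_remaining


for lost in (0.0, 0.5, 0.9):
    frac = active_fraction_after_evaporation(0.02, 0.70, lost)
    print(f"{lost:.0%} of water gone -> active is {frac:.1%} of what's left")
# 0%  -> 2.0%, 50% -> 3.1%, 90% -> 5.4% of the remaining film by weight
```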
In a sense, studying new structures would remove some guess work in manufacturing effective lotions because it would remove an unknown from the equation: companies could work from the template of the new structure rather than one that is nonexistent or, at best, flawed.
"I think it would be possible to save some lab work by knowing what is going on, and it could open a new marketing opportunity," Friberg said.
As for cosmetics, tradition has a head start on science, Friberg said. For instance, the latest interest in skin care -- hydroxy acids, the active ingredients in anti-wrinkle creams -- have been used for thousands of years and date back to Cleopatra, whose bath contained lactic acid (a hydroxy acid) which the classic beauty obtained from sour donkey milk.
"Cosmetics have a very long period of use," he said. "The companies involved have a tremendous knowledge of what works and doesn't work just from experience. Once they show somewhere that something works, then everyone jumps on the bandwagon." ###
The American Chemical Society -- the world's largest scientific society -- is a nonprofit organization chartered by the U.S. Congress and a global leader in providing access to chemistry-related research through its multiple databases, peer-reviewed journals and scientific conferences. Its main offices are in Washington, D.C., and Columbus, Ohio.
-- John Simpson
Note for reporters' use only: For full information about the Boston meeting, including access to abstracts of more than 9,500 scientific papers and hundreds of non-technical summaries, visit http://www.acspresscenter.org. News release images are available at http://chemistry.org/bostonnews/images.html.
The poster on this research, COLL 418, will be presented at 9:00 a.m., Wednesday, Aug. 22, at the Boston Convention & Exhibition Center, Room 153B, during the session "Surfactants and Polymers for Personal, Home and Health Care."
Stig E. Friberg, Ph.D., is a chemist at the University of Virginia in Charlottesville, Va.
Contact: Charmayne Marsh c_marsh@acs.org
617-954-3488 (Boston, Aug. 19-23)
202-872-4400 (Washington, D.C.)
Michael Bernstein m_bernstein@acs.org
American Chemical Society 234th National Meeting
3-D model of famous amnesiac's brain helps illuminate human memory
IMAGE: Jacopo Annese prepares H.M.'s brain, preserved in gelatin, for freezing and subsequent cutting into 2,401 thin tissue slices.
view more Credit: UC San Diego School of Medicine
During his lifetime, Henry G. Molaison (H.M.) was the best-known and possibly the most-studied patient of modern neuroscience. Now, thanks to the postmortem study of his brain, based on histological sectioning and digital three-dimensional construction led by Jacopo Annese, PhD, at the University of California, San Diego, scientists around the globe will finally have insight into the neurological basis of the case that defined modern studies of human memory. The microscopic anatomical model of the whole brain and detailed 3D measurements of the medial temporal lobe (MTL) region are described in a paper to be published online in Nature Communications on January 28. H.M. was an epileptic patient whose severe and almost total amnesia was the unexpected result of a bilateral surgical ablation of the MTL, including the hippocampus, in 1953. Until his death in 2008, the purity and severity of H.M.'s memory impairment, along with his willingness to participate in continual testing, made his case uniquely influential. While his intellectual abilities, personality, language and perceptual skills remained intact, he was unable to store information in long-term memory. After his brain operation, H.M. was profoundly impaired in forming new declarative memories. This unfortunate outcome became the catalyst for over 50 years of scientific discoveries (and thousands of publications) that have radically changed scientists' basic understanding of memory function. His case was significant because it provided the first conclusive evidence for the involvement of the hippocampus in forming new memories.
In December 2009, Annese and his team dissected H.M.'s brain into 2,401 thin tissue slices that were then preserved cryogenically in serial order. While the brain was being sliced, the researchers collected an unabridged series of digital images of the surface of the block, corresponding to each tissue section. These images were archived and used to create a three-dimensional microscopic model of the whole brain. The model of H.M.'s brain contains clues to help understand the surgery performed in 1953, and the level of sampling and image quality afforded by this study represents a significant advance over the MRI scans performed with H.M. when he was alive.
"Our goal was to create this 3D model so we could revisit, by virtual dissection, the original surgical procedure and support retrospective studies by providing clear anatomical verification of the original brain lesion and the pathological state of the surround areas of H.M.'s brain," Annese said. But the study reveals a small, circumscribed lesion in the left orbitofrontal cortex that had been previously undiscovered, showing the power of the technique. Based on the 3D geometry of the lesion and the type of the lobectomy that was performed in 1953, Annese thinks this lesion was very likely created by Scoville during the surgery. The findings reported in Nature Communications constitute new evidence that may help scientists today understand more fully the consequences of H.M.'s operation in the context of modern knowledge on memory of the functional anatomy of the hippocampus.
Annese and his team at UCSD also created a web-based atlas of H.M.'s brain, meant to support collaboration and preserve an archive of anatomical images relative to the case. The atlas contains structural delineations and digitized versions of the stained histological slides that can be viewed at the cellular-level using Google maps, a level of detail not seen before.###
For more information on the H.M. project and the Brain Observatory, visit: thebrainobservatory.org/hm2. To view a Google maps version of the slice featured in the paper (Fig. 5), go to: https://thedigitalbrainlibrary.org/hm_web/teaser.php Funding for the study was provided by grants from the National Science Foundation (NSF-SGER 0714660), the Dana Foundation Brain and Immuno-Imaging Award, and by private contributions from viewers of the web broadcast of the dissection. In the course of the study, Annese was supported in part by research grants from the National Eye Institute, R01EY018359-02 and ARRA R01 EY018359-02S1 and the National Institute of Mental Health, R01MH084756.
Scott LaFee
slafee@ucsd.edu
@UCSanDiego
http://www.ucsd.edu More on this News Release
National Science Foundation, Dana Foundation Brain, Immuno-Imaging Award
H.M. Brain Prep (IMAGE)
view more H.M. Frozen Brain (IMAGE)
view more Annese with Slides (IMAGE)
view more More in Medicine & Health | 科技 |
2017-09/1580/en_head.json.gz/1954 | Air Toxics
It All Adds Up
Emissions Calculator Toolkit
Air Quality Contacts
For more information, please contact Mark Glaze.
CMAQ: Advancing Mobility and Air Quality
NAVIGATOR-Advanced Transportation Management System (ATMS)-Atlanta, Georgia
Type of Project: Traffic Flow
Project Cost: CMAQ Cost: $54 million; Total Project Cost: $140 million
Context and Background
The Atlanta region is known for its traffic congestion; it is one of the fastest growing metropolitan areas in the country. Atlanta's population nearly doubled from 1.6 million to 3.0 million between 1982 and 2000, and system-wide daily vehicle miles traveled tripled over this period.26 As host to the 1996 Olympic Summer Games, Atlanta was expected to draw some two million visitors plus thousands of athletes, coaches, and officials from around the world. Given the region's reputation for traffic congestion, regional leaders were concerned about accommodating the influx of people and traffic for the Olympics. Few things could destroy the excitement of the Olympics-not to mention the reputation of the region and economic opportunities stemming from the event-faster than transportation gridlock. Solutions were needed to improve mobility for the games, as well as to promote the continued economic well-being of the region.
The Advanced Transportation Management System, or NAVIGATOR, was developed to help better manage traffic flow and provide real-time traffic information to improve transportation decisions and public information. NAVIGATOR is a computerized transportation communication system that employs fiber optic technology to gather traffic information. It uses video detection, radar detectors, and more than 450 closed-circuit television cameras to monitor traffic flow. The system enables control center managers to detect traffic incidents and congestion rapidly, and subsequently dispatch Highway Emergency Response Operators (HEROs). Five ramp meters are used to control highway traffic flow, and information technologies (such as 67 changeable message signs, the Internet, and 140 information kiosks) help provide motorists with real-time traffic information. In developing the system, more than 400 traffic intersections were upgraded to improve signal coordination throughout Atlanta and the metropolitan region. In addition to elements that improve highway traffic flow, the system includes transit management, electronic fare payment, and multimodal traveler information. The Metropolitan Atlanta Rapid Transit Authority (MARTA) has access to information generated by the ATMS and shares information on road conditions. For example, information on an accident that is radioed into MARTA by bus drivers is available to ATMS to help manage traffic patterns and incident response.
The system is housed in a $14 million transportation management center (TMC). Operated by the Georgia Department of Transportation (and centrally located in Atlanta on the same compound as the Georgia State Patrol and the Georgia Emergency Management Agency), the TMC serves as the control center for transportation emergencies. Having all surveillance and control functions under one roof facilitates decision-making and helps Atlanta's transportation officials more effectively manage the day-to-day demands of the transportation system. The TMC is linked to seven regional Transportation Control Centers in Clayton, Cobb, Dekalb, Fulton, and Gwinnett counties, the City of Atlanta, and MARTA. These satellite facilities and the TMC monitor 90 miles of interstate highway in the Atlanta region and represent the forefront of ITS traffic data gathering, communications, analysis, and incident-response activities.
I-85 and I-285 intersection north of Atlanta.
Results and Status
The incident management components of the system resulted in substantial savings in traveler delay. Using conservative estimates, Georgia DOT estimates that the incident management components of NAVIGATOR have reduced the average incident duration by 23 minutes, from an average of 64 minutes to 41 minutes. All elements of incident management are faster: incident detection and verification are faster due to traffic camera coverage; response identification and dispatch are speeded up by the computer system; and response time and clearance are also faster due to the HEROs (previously, local police and fire agencies responded to freeway incidents). It is estimated that, in total, the incident management components of NAVIGATOR have resulted in nearly 3.2 million hours in reduced delay time per year for travelers on Atlanta's highways. The delay savings accrued mostly during the peak hours of traffic, with 6:00 a.m. to 10:00 a.m. delay reductions of 1.3 million hours and 3:00 p.m. to 7:00 p.m. reductions of 1.9 million hours. These savings have resulted in a cost savings of $45 million per year for travelers.27
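The delay and cost totals follow from straightforward arithmetic: minutes saved per incident, multiplied by the vehicles affected and the number of incidents, converted to hours and then to dollars with a value of travel time. The sketch below shows the shape of that calculation; apart from the 64-to-41-minute duration change, every input is an illustrative assumption rather than a figure from the Georgia DOT analysis.

```python
# Back-of-the-envelope sketch of incident-management delay savings.
# All inputs except the 64 -> 41 minute incident duration are assumed values.

MINUTES_SAVED_PER_INCIDENT = 64 - 41          # from the reported durations
INCIDENTS_PER_YEAR = 25_000                   # assumption
VEHICLES_DELAYED_PER_INCIDENT = 350           # assumption
VALUE_OF_TIME_PER_HOUR = 14.00                # assumption, $/person-hour

vehicle_minutes_saved = (MINUTES_SAVED_PER_INCIDENT
                         * INCIDENTS_PER_YEAR
                         * VEHICLES_DELAYED_PER_INCIDENT)
hours_saved = vehicle_minutes_saved / 60
dollars_saved = hours_saved * VALUE_OF_TIME_PER_HOUR

print(f"Annual delay reduction: {hours_saved:,.0f} vehicle-hours")
print(f"Annual traveler cost savings: ${dollars_saved:,.0f}")
```

With these assumed inputs the totals land near the reported 3.2 million hours and $45 million, which is the point of the exercise: the headline figures are the product of a handful of per-incident quantities that an agency can estimate from its own logs.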
Beyond the incident management components, NAVIGATOR also provided motorists with information to make informed decisions regarding their traveling options, including ways to avoid spending time stuck in traffic delays. The system also improved the reliability of transit schedule information and decreased traveler waiting time. The Georgia DOT suggests that other benefits of the system include improved roadway safety, reduced air pollution, and more efficient use of emergency services.
Estimated Emissions Benefits
614 kg/day VOC, 578 kg/day NOx
Contact
Mark Demidovich
Georgia Department of Transportation
404-635-8009
26 Texas Transportation Institute. 2002 Urban Mobility Study. On-line at: mobility.tamu.edu/ums/.
27 Presley, Michael and Katherine Wyrosdick. Calculating Benefits for NAVIGATOR, Georgia's Intelligent Transportation System. Atlanta: Georgia Department of Transportation, September 23, 1998.
Lawrence Tech researchers seek better ways to build bridges
Testing at Lawrence Technological University on the limits of ultra-high-performance concrete could bring major changes to the way highway bridges are built in states with cold climates.
LTU’s Dean of Engineering Nabil Grace and his research team are studying alternative materials for the Michigan Department of Transportation and the state transportation agencies of Oregon, Minnesota, Wisconsin and Iowa. LTU was awarded three-year, $349,000 contract in 2012 through the pooled fund of the Federal Highway Administration to evaluate alternative materials and new approaches to building highway bridges.
On Feb. 5, civil engineering researchers at LTU conducted a shear test on a new kind of concrete patented by LaFarge that is considered almost as strong as steel – so strong that the researchers believe rebar stirrups are no longer needed inside the concrete to guard against stresses that can lead to shearing. They tested an innovative hybrid bridge beam made of high-strength concrete in the flexural zone and ultra-high-performance concrete in the shear zone.
They found that it took nearly almost 95,000 pounds to break a beam made with ultra-high-performance concrete without stirrups compared to a beam made with a high-performance concrete with traditional rebar stirrups now used for building bridges.
The experimental concrete could be combined with alternative materials like carbon fiber to replace the traditional steel reinforcement to create bridges that can last 100 years – twice as long as conventional bridges – with much less maintenance.
The Lawrence Tech researchers will share their data with MDOT and the agencies from the other four states, and the results could lead to innovations in bridge construction.
For more than two decades, LTU’s Dean of Engineering Nabil Grace has been researching ways to build longer-lasting highway bridges by replacing steel reinforcement with alternative materials.
Lawrence Tech research was behind the construction of the first bridge in the United States built with carbon fiber in place of steel, in 2001. Numerous sensors placed on the bridge measure its performance under the stresses of heavy traffic and the full range of Michigan weather. That field data has been compared with test results from LTU's Center for Innovative Materials Research.
SHARE THIS: TOOLS: Spring 1998 — As the oceanic plates move apart at mid-ocean ridges, rocks from Earth’s mantle, far below, rise to fill the void, mostly via slow plastic flow. As the rock approaches the top of its journey, however, partial melting occurs, so that the upper 6 kilometers of oceanic crust are composed of melts, which both erupt on the seafloor as lava and crystallize beneath the surface to form what are known as “plutonic” rocks.
Melt forms in tiny “pores” along the boundaries between different crystals, and the pores form an interconnected network of tubes 1 to 10 microns in diameter. Since the melts are less dense than the residual solids, they are relatively buoyant and move upward faster than the rest of the upwelling mantle. The exact process by which very small amounts of melt rise to form voluminous lava flows is the subject of some controversy. We do know that:
1) melts must rise through the upper 30 kilometers of the mantle without equilibrating with the surrounding rocks, and
2) melt forms in a region several hundred kilometers wide and then coalesces into a narrow band just a few kilometers wide—the mid-ocean ridge—where igneous crust is formed by crystallization of magma.
Studying processes that take place within the earth is not easy, but plate motions occasionally thrust pieces of crust to the surface that provide windows to the interior: Outcrops of Earth’s lower crust and upper mantle are exposed in ophiolites, slices of oceanic plates that are found on land. They generally occur in mountain belts along continental margins and are usually tilted, so that in the largest ophiolites the erosional surface exposes rocks that ordinarily reside 10 or even 20 kilometers below the seafloor. We turned to ophiolites to address the question of how slow, porous flow preserves disequilibrium between melts and the upper 30 kilometers of residual mantle rocks.
The process by which ophiolites are thrust onto the continental margins is uncertain. This raises a basic issue: The rocks in ophiolites are certainly formed at submarine spreading centers similar to the mid-ocean ridges, but they may be exceptional in their composition or structure—after all, “normal” oceanic crust is under water! Furthermore, like all exposures of ancient rocks, ophiolites only preserve an indistinct record of their formation. For questions regarding how representative they are, how hot they were, how fast they formed, scientists will always look to direct observations of active mid-ocean ridges.
However, rocks in ophiolites closely resemble the rocks sampled by dredging and drilling in the deep ocean basins, and the vertical structure of ophiolites corresponds well to the structure of oceanic plates inferred from geophysical measurements. Because they expose very deep rocks, and because their three-dimensional structure can be determined directly by geologic mapping, ophiolites provide a complete view of the oceanic plates’ internal geometry that is unmatched by any sampling or imaging technique used at sea. For example, the deepest holes so far drilled into submarine oceanic plates are 1.5 to 2.5 kilometers deep, about 10 times shallower than the deepest exposures in large ophiolites. And drill core is only 6 centimeters in diameter, whereas deep exposures in ophiolites extend for hundreds of thousands of square kilometers.
Because of these factors, there is an ongoing dialogue between scientists conducting seagoing research and those involved in ophiolite investigations that is essential to our understanding of the structure and genesis of oceanic plates.
Several types of melt extraction feature can be observed in the mantle section of an ophiolite located in Oman. They range from “dikes,” fractures filled with minerals crystallized from a melt, to “replacive dunites,” rocks composed only of the mineral olivine. Both of these types of feature are hosted within the normal mantle rocks, called peridotites, that are composed mainly of the minerals olivine and pyroxene. Our work focuses on the replacive dunites, which are formed when melt migrating by porous flow, saturated only in olivine, dissolves all the pyroxene from the rock. Dunites in the Oman mantle section show a deformation pattern that is sub-parallel to the paleo-seafloor, like the surrounding peridotites, indicating that the dunites must have formed in the region of mantle upwelling beneath the spreading ridge.
Our data show that replacive dunites in Oman formed within and around conduits of focused porous flow of melt deep in the mantle, and that the dunites were in equilibrium with the melts that formed the crust. Focused porous flow through dunites satisfies the constraint that melt must be transported through the upper 30 kilometers of the mantle without equilibrating with residual perido-tites. The width of the largest dunites exceeds 50 meters. Melts moving by porous flow in the interiors of such large dunites are separated from residual peridotite by many meters of dunite and thus escape chemical interaction with residual peridotite even though the melts are moving slowly.
How do dunites form? One possibility is that they simply arise as a result of reaction during porous flow. In the late 1980s, a group of hydrologists at the University of Indiana, led by Peter Ortoleva, conducted theoretical studies of porous flow of a solvent through partially soluble rock. They found that the coupling of flow and dissolution leads to the formation of high porosity channels that focus porous flow. Small perturbations in the initial porosity will grow exponentially because of a positive feedback loop: More fluid flow passes through areas with higher porosity, increasing the local dissolution rate, which increases the porosity, which in turn leads to yet higher flow velocities.
Ortoleva’s group and several other labs developed a “solution front” theory, in which undersaturated liquid comes into contact with the soluble porous medium along a planar front perpendicular to the liquid flow. MIT/WHOI Joint Program graduate student Einat Aharonov used sophisticated mathematics and new modeling techniques to show that even in a system where the liquid and solids maintain equilibrium everywhere, if the solubility of the solid in the liquid increases downstream, the result will be the exponential growth of dissolution channels.
Aharonov’s analysis also predicted the spacing between the fastest growing channels in the initial stages of channel growth. This spacing is proportional to the average flow velocity. As channels grow and become more permeable, the flow velocity increases. Based on this result, we predicted that over long time periods and long distances, natural systems would also produce networks with decreasing numbers of channels as a function of increasing time, and as a function of distance downstream. Such a channel network, coalescing downstream, is schematically illustrated in Figure 1. We called this picture Einat’s Castle.
Aharonov’s calculations and simulations were focused on the initial, infinitesimal stages of channel growth. More recently, Marc Spiegelman (Lamont-Doherty Earth Observatory, Columbia University) has modeled the process continuing to finite times. Many features that are transient in the initial stages reach steady state at longer times. Most importantly, the number of channels vs. distance downstream becomes steady, producing a coalescing drainage network (Figure 2) that is remarkably similar to the simple prediction in Einat’s Castle.
As for many coalescing channel systems in nature, the networks in Figures 1 and 2 show a “power law” relationship between the number of channels and the flux (volume transported per unit time) in the channels (Figures 3A & B). This arises because the flux doubles when two channels meet to form a larger one downstream, and this doubling is multiplied at each junction. This explains the slope in Figure 3A for Einat’s castle.
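The flux bookkeeping behind that power law can be reproduced in a few lines: in an idealized binary network each junction merges two channels into one, so the number of channels halves at every level while the flux each one carries doubles. The sketch below is just that counting argument, not the Aharonov or Spiegelman models; the starting channel count is an arbitrary assumption.

```python
# Idealized coalescing network: channels merge in pairs downstream while the
# total flux is conserved, giving a power law between channel number and flux.
import math

def binary_coalescence(n_initial=1024):
    """Return (number of channels, flux per channel) for each level."""
    levels = []
    n, flux = n_initial, 1.0
    while n >= 1:
        levels.append((n, flux))
        n //= 2          # two channels merge into one
        flux *= 2.0      # the merged channel carries the combined flux
    return levels

if __name__ == "__main__":
    levels = binary_coalescence()
    print(f"{'channels N':>10} {'flux q':>8} {'N*q':>8}")
    for n, q in levels:
        print(f"{n:>10d} {q:>8.0f} {n * q:>8.0f}")   # N*q stays constant
    # slope of log N vs log q between successive levels -> -1 (a power law)
    (n1, q1), (n2, q2) = levels[0], levels[1]
    slope = (math.log(n2) - math.log(n1)) / (math.log(q2) - math.log(q1))
    print("log-log slope:", round(slope, 2))
```

The minus-one slope follows purely from conservation of flux at pairwise junctions, which is what makes the relationship a useful diagnostic of coalescing flow.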
It is interesting to note that Theodore Wilson (University of Minnesota) in 1976 found the same kind of relationship, with conservation of flux downstream in human bronchial tubes. Their geometry is governed by an additional constraint that the cross-sectional area of the tubes is conserved at junctions. Such a branching geometry was first proposed by Leonardo da Vinci to account for the branching properties of some kinds of trees. The combination of downstream conservation of flux and cross-sectional area produces a power law relationship between the number and radius of the tubes in such a branching system (Figure 3C).
None of this “proves” that the dunites we observed in the Oman ophiolite formed solely as a result of dissolution channeling. In addition to dissolution channeling, dunite formation may also have been influenced by other processes including focusing of melt in zones of localized deformation, “suction” of melt into the region of plate separation beneath ridges, focusing of melt flow beneath permeability barriers, especially near the top of the upwelling mantle column, and perhaps the formation of dunite in porous reaction zones around melt-filled cracks. All of these processes may operate—in different times and places—and it is not yet clear which are dominant in melt extraction from the mantle beneath oceanic spreading ridges.
It was with the hope of providing vital new data to discriminate between these possibilities that we conceived the idea of taking photographs from a blimp. We reasoned that the preferred channel spacings and coalescing networks that arise in simple models of porous flow through a soluble solid matrix might be found in the mantle dunites of the Washington Cascades. If a regular spatial distribution of dunites existed and could be photographed by a blimp-mounted camera, it might indicate an important role for dissolution channeling during dunite formation.
We were inspired by detailed, photomosaic maps of RMS Titanic made by the remotely operated vehicles Jason and Argo, and reasoned that we should be able to produce similar maps on land. In 1996, WHOI Associate Scientist Greg Hirth and I received a WHOI Mellon Independent Study Award that allowed us to purchase an 8-meter-long blimp, 3 meters in diameter, from AdverBlimp, in Sioux Falls, South Dakota. It is similar to blimps that fly over shopping malls and car dealerships, announcing a sale or the opening of a new showroom. BOLO (Blimp for On Land Oceanography) can lift about 20 kilograms. We used a helicopter to fly helium into a backcountry campsite, inflated the blimp, hung a camera beneath it, and photographed rock outcrops in alpine meadows at about 2,000 meters elevation.
In geology it is impossible to escape the influence of history. Thus, we did not expect to reveal a simple, coalescing network. The mantle that will undergo melting as it rises beneath a spreading ridge has a complicated prior history that began with accretion of material that formed Earth, and has continued through many episodes of crustal formation and recycling. This gives rise to compositional variations, which in turn produce varying amounts of melt. In addition, the crystals that form the mantle have been stretched and aligned, “lineated,” by billions of years of convection. Because of these processes, melt flow networks encounter pre-existing structures that guide their geometry, making predictions of their patterns difficult. The inescapable effects of history, potentially unique for each specific region, constitute a major barrier to theoretical prediction in the earth sciences. A more useful application of a model such as ours is to try to gain insight into general principles governing coalescing networks of porous channels. One such principle is a tendency toward exponentially increasing flux with an exponentially decreasing number of downstream channels (Figure 3).
Figure 4 illustrates one photo frame taken from the blimp along with a combination of about 50 such frames into a digital photomosaic. Creation of this mosaic was greatly aided by the fact that the outcrops are almost perfectly flat! We made all our photographs with the camera lens oriented perpendicular to the plane of the outcrop, from a constant distance away. Furthermore, we were lucky in that all of the channels in the dunites of this area are oriented perpendicular to the plane of the outcrop, which greatly eases determination of their true size in three dimensions.
In Figure 4, the channels are in black, residual mantle peridotite is grey, and areas covered by soil or vegetation are white. Having created this image, our next step was to measure the size and number of dunites. We did this by adding a grid of straight lines, aligned perpendicular to the general trend of the dunites. We asked our computer to find all the intersections between the lines and the dunites, and measure their length. Then we combined and analyzed the measurements in terms of the number of intersections with a given length. These data are shown in Figure 5 for several outcrops.
It is striking that the width of the dunites (intersection length) has a power law relationship to their number, with a negative slope, as in the size/frequency plot for human bronchial tubes in Figure 3C. The presence of such a regular structure suggests that all the dunites were active at once, and formed a coalescing network in which small conduits of focused flow within many small dunites were connected to a few large conduits within large dunites. The constancy of the slope from outcrop to outcrop gives us confidence that the measurement is “robust”—not an artifact due to a limited number of observations.
Human bronchial tubes are open cylinders, whereas conduits within dunites in the mantle are channels of high porosity. Thus, the slopes in Figures 3C and 5 should not be the same. We can reproduce the observed slope for the dunites in Figure 5 with a simple scaling relationship in which flux is conserved, channels join in pairs with similar sizes, and the porosity varies from 5 percent in the biggest dunites to 0.5 percent in the smallest ones.
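That scaling argument is easy to make concrete. Treat each dunite as a porous channel whose melt flux is proportional to porosity times width, require the flux to be conserved when two similar channels merge, and let porosity rise from 0.5 percent in the smallest channels to 5 percent in the largest. The sketch below is a toy version of that reasoning rather than the authors' actual fit, and the channel count, number of generations and proportionality are illustrative assumptions.

```python
# Toy version of the width/number scaling: channels merge pairwise (so the
# number N halves and the flux q doubles each generation), flux per channel
# is proportional to porosity * width, and porosity rises from 0.5% in the
# smallest channels to 5% in the largest. All constants are illustrative.
import math

N0 = 1024            # number of smallest channels (assumed)
GENERATIONS = 10
PHI_MIN, PHI_MAX = 0.005, 0.05

rows = []
for g in range(GENERATIONS + 1):
    n = N0 // 2**g                       # channel count at this generation
    q = 2.0**g                           # flux per channel (total conserved)
    phi = PHI_MIN * (PHI_MAX / PHI_MIN) ** (g / GENERATIONS)  # porosity
    width = q / phi                      # since flux ~ porosity * width
    rows.append((n, width))

print(f"{'N':>6} {'relative width':>15}")
for n, w in rows:
    print(f"{n:>6d} {w:>15.1f}")

slope = (math.log(rows[-1][0]) - math.log(rows[0][0])) / \
        (math.log(rows[-1][1]) - math.log(rows[0][1]))
print("log-log slope of N vs width:", round(slope, 2))
```

Printing the slope makes it easy to see how the assumed porosity range steepens the width-number relationship relative to the open-tube case of the bronchial network.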
Much more data analysis will be required in order to adequately test the theory that a chemical dissolution process is responsible for this structure. However, the linear relationship in Figure 5 is exciting for many reasons. For melt extraction, it suggests that the relationship between width and number of channels might be generally predictable. If so, we can extrapolate this relationship to a much larger area, such as the cross-sectional area of the mantle beneath an active mid-ocean spreading ridge, as schematically illustrated in Figure 6. This approach could be valuable because we will never be able to observe such a huge area directly. We can then ask, are there dunite channels wide enough and numerous enough to account for chemically isolated transport of melt from deep in the melting region to the base of the crust? Our provisional answer is “yes.”
Looking beyond the specialized field of melt extraction, our results on the mechanisms and consequences of flow focusing become even more exciting. In 1976, Wilson emphasized the fundamentally important point that the coalescing network of bronchial tubes in the human lung is optimally efficient. For a system with a single large inlet (your throat), and the requirement to disperse air to the surface of thousands of capillaries less than a millimeter in diameter, the geometry of Figure 3C minimizes energy loss due to friction of moving air along the tube walls. In terms of thermodynamics, the bronchial network minimizes entropy (the amount of energy unavailable for useful work in a system undergoing change).
Perhaps it is not surprising that human lungs are optimally efficient. After all, organisms on Earth have had billions years of trial and error to work out the bugs in their design. However, some chemical systems develop similar structures in minutes. Nobel-prize-winning chemist Ilya Prigogine (University of Texas at Austin and International Solvay Institutes, Belgium) showed that, on laboratory time scales, some systems, which are open to chemical transport and which remain close to equilibrium, tend to find a steady state that minimizes entropy production. Again, perhaps this is understandable since chemical organization is the statistical result of countless molecular interactions, each of which happens in nanoseconds.
It may be that a variety of Earth systems—porous channels, mantle convection, rivers, faults—all tend to develop geometries that minimize entropy production. They might not always reach an optimal structure, given the constraints of geologic history. However, just understanding this tendency could lead to the development of valuable predictive tools, and provide a unifying factor in the study of disparate natural phenomena.
Initial research described in this article was supported by the National Science Foundation. Purchase and use of BOLO (Blimp for On-Land Oceanography) in the Washington Cascades was supported by a WHOI Mellon Independent Study Award.
Peter Kelemen has been at WHOI, as a postdoc and then a scientist, for 10 years. During this time he and his wife, Rachel Cox, have acquired a house, two kids, and Rachel’s Ph.D. in physiology from Boston University Medical School. During the initial four of his six years as an undergraduate, Peter was an English and philosophy major at Dartmouth College. He then realized he would need to get a job when he graduated. In the meantime, he had learned technical climbing techniques. He reasoned that it would be best to work outside in the mountains, and so switched to a major in earth sciences. In 1980, Peter and some friends founded Dihedral Exploration, a consulting company specializing in “extreme terrain mineral exploration.” From 1980 to 1991, he split his time between geological research and mineral exploration, in the process obtaining a Ph.D. from the University of Washington. As a mineral exploration consultant and research scientist, Peter has been fortunate to work in the mountains of California, Oregon, Washington, British Columbia, Peru, the Yukon, Alaska, the Indian Himalaya and Karakorum Ranges, East Greenland, and even along the Mid-Atlantic Ridge, as well as in the Oman ophiolite.
Undersea cables span the globe to send more data than satellites
By Geof Wheelwright
It was in the mid-1800s that communications pioneers first laid cables on the seabed to create the world’s telegraph — and later telephonic — infrastructure.Visionary French writer Jules Verne was so enraptured by the idea of undersea telegraph cables that one features in his 1870 novel 20,000 Leagues Under the Sea, when Captain Nemo’s crew find the remains of the world’s first transatlantic cable, laid in 1855, off the shores of Newfoundland.
Now, 146 years after Verne’s book, undersea cables are thriving, according to Alan Mauldin, research director at the Washington DC offices of TeleGeography, an international telecommunications market research company.Citing data published by the US Federal Communications Commission, he says the technology is the dominant method of international telecommunications, and about 99 per cent of all intercontinental telecoms traffic — data, phone calls, texts, emails — is transmitted via submarine cables. There is big growth in the sector now because of the fast-rising requirements of cloud-based technology businesses and their customers, as well as the demand for greater capacity from financial services companies seeking the smallest possible delays in transaction times (known as latency). Mr Mauldin says that demand for global bandwidth is growing at up to 40 per cent year.There was a time when cables were laid and controlled by large consortiums of national telecommunication carriers, but this is changing. Microsoft and Facebook announced this year they are jointly building Marea (Spanish for “tide”), a 6,600km scheme billed as the “highest-capacity subsea cable to ever cross the Atlantic”. This cable is faster per second by 16m-20m times than a home internet connection and is due to be completed by October 2017. It will be operated by Telxius, the infrastructure unit of Spain’s Telefónica, and run from Virginia in the US to Bilbao, Spain, and then to network hubs in Europe, Africa, the Middle East and Asia. The companies have not provided costings for the project.Meanwhile, Google was part of a $300m consortium that backed the 9,000km Faster cable project linking the US West Coast to Chiba and Mie prefectures in Japan, completed in June. Three months later, Google extended the cable to Taiwan to increase the speed and reliability of its services.Peter Jamieson, chairman of the European Subsea Cables Association, an industry group, welcomes the investment of large technology companies. He says many systems are still run by older companies such as BT, AT&T, Telefónica, Vodafone and large communications providers like Level 3, Hibernia Networks and Global Cloud Xchange.After a long period in which the incumbents have been adding little capacity, established companies are starting to do more, Mr Jamieson says. Vodafone this year went live with a cable from Bengal to Southeast Asia, South Asia and the Middle East.In September 2015, US-owned Hibernia Networks rolled out Hibernia Express, an “ultra low latency service” aimed at the financial services industry. It boasted of being able to provide the lowest latency between New York and London, and said it was a 5 millisecond improvement compared to existing high-speed networks which now link businesses in the two global financial centres. Meanwhile, Aqua Comms, based in Dublin, announced in January that its America Europe Connect subsea fibre-optic cable network from County Mayo to Long Island was live. The company says this too offers reduced latency.Mr Jamieson adds that Google, Facebook, Amazon and Microsoft are not content with leasing capacity on services managed by others.“They want to control their own traffic and therefore they are co-financing the new round of submarine cables.”He rejects the idea that satellite technology is a competitor to undersea cables for carrying digital communications. “Satellites cannot compete with the capacity required, the speed or the latency,” he says. 
But investment in the cable industry is not for those seeking a quick return. Backers need patience and deep pockets to undertake the time-consuming process of building new cables, according to Paul McCann, who runs an undersea cable consultancy in Sydney, Australia. “It is a long-term investment, not a ‘get rich quick’ opportunity,” says Mr McCann.Mr Mauldin agrees, and says investors who are likely to get frustrated with the process should best avoid it. “From having the idea, to finding a route, to doing a survey of the marine area, to getting the environmental permits, building the cable and then laying the cable — usually it takes at least two years depending on where you are laying it,” he says. Additionally, working out when a completed project will actually be delivered is not an exact science, he adds. “Usually people announce one date and then add at least six months to it.” | 科技 |
2017-09/1580/en_head.json.gz/2397 | Related Program: Weekend Edition - Saturday New Projects Help 3-D Printing Materialize By editor
Jul 7, 2012 Related Program: Weekend Edition - Saturday TweetShareGoogle+Email Listen Listening...
Originally published on July 7, 2012 6:57 am Transcript SCOTT SIMON, HOST: You may have heard of 3-D printers. These are computer controlled machines that create three-dimensional objects from a variety of materials. They've been kind of a novelty for a while but now they are being discovered by everyday consumers. Jon Kalish reports. JON KALISH, BYLINE: Sean Hurley works for a software company called Autodesk. Not long ago the door on his clothes dryer at home developed a problem. It wouldn't stay shut, which made it impossible to use the dryer. SEAN HURLEY: The little clip when you shut the door that locks it is just a half-inch flat little piece of plastic with a little channel it. That's all it was. Thirty-nine bucks. KALISH: Yep, 39 bucks to replace a little piece of plastic. So Hurley used his computer to design the part and then made it on a 3-D printer. Problem solved. Hurley knew how to do this because he knows how to use software to make a 3-D model. But for those of us who don't have that kind of computer chops, Hurley's company makes free software that fills in the gap. HURLEY: 123D Catch allows somebody to use a regular camera and capture photos that generate a 3-D model. It's kinda like magic. KALISH: And if even that seems too involved, there are now multiple sites on the Internet that offer free 3-D models for a variety of consumer items. For example, if the headband on your headphones breaks, you can have a new one printed from one of these free files. And more and more of these files are being made available every day. New Yorker Duann Scott had one of those pricey Bugaboo baby strollers. When a part in the locking mechanism broke he was told it would cost $250 to fix it. Scott spent all of five minutes creating a 3-D model of the broken plastic part and had it printed in a stronger material: stainless steel. The cost: $25. DUANN SCOTT: It came back just under two weeks later and I put it in the stroller and it worked straight away. And I documented it so that anybody could fix their stroller using my experience. And I made the three files available for download for free so anyone else can repair it. And I don't get any money from it. I just want people to fix their stroller in the same way I could. KALISH: 3-D printers and other digital fabrication machines are allowing designers and consumers to bypass the traditional factory and create goods on a much smaller scale. But even if you skip the factory, you'll still need access to expensive digital fabrication technology such as laser cutters and CNC routers. Ponoko is one of several companies that can take a customer's own design, or one made by a professional, and fabricate consumer goods made from a variety of materials, including metals, wood and felt. The company has 15 different production facilities in the United States. At their Wellington, New Zealand headquarters Ponoko staffer Richard Borrett showed off a pair of stools made from birch plywood. RICHARD BORRETT: This is one of the first products that came to our showroom when we first launched CNC cutting. He's managed to nest two of the stools onto one sheet of birch ply. And that all slots together, no fasteners. It's a complete sort of flat-pack design and you pop the pieces out and slot it all together. (SOUNDBITE OF CNC MILLING MACHINE) KALISH: A few blocks from Ponoko's office in Wellington, parts for a wooden chair are being cut on a CNC milling machine at Victoria University. 
Two graduates of the university's design program started the SketchChair project, which allows people with absolutely no artistic or computer skills to design their own chairs. The open source software they use not only figures out how to construct the chair but also creates a computer file that tells a CNC router how to cut out the parts. Tiago Rorke is one of the designers behind SkechChair. TIAGO RORKE: We weren't so much interested in trying to create a brand of furniture. Like, we were a little bit more interested in taking the model of open source software development and apply that to product design. You know, it's not just the SketchChair software which will be available for free but the designs themselves. And so, it takes it away from being about the original sole authorship but more about people working collaboratively through a community and I guess sharing their ideas is a big part of it. KALISH: If you'd like to digitally fabricate a table to go with your chair, there are free designs available from the AtFab project, which was started by architects in Lexington, Kentucky. They also give away files to make cabinets and beds. Of course, most people don't have access to an expensive CNC milling machine, but there's a web site called 100 Thousand Garages that serves as a matchmaker for consumers and woodworkers who use these digital fabrication technologies. For NPR News, I'm Jon Kalish. Transcript provided by NPR, Copyright NPR.TweetShareGoogle+EmailView the discussion thread. © 2017 KRWG | 科技 |
Denver, May 7, 2001 ? Qwest Communications International Inc. (NYSE: Q), the broadband Internet communications company, today announced it has signed four new contracts valuing more than $30 million for various broadband Internet communications services. The contracts are with Albuquerque Public Schools in New Mexico; Denver International Airport (DIA); Fuzion Wireless Communications Inc., a subsidiary of Fuzion Technologies Group, Inc.; and Georgia Technology Authority (GTA), which oversees technology procurement for Georgia state government
?These new accounts represent continuing strong demand for broadband Internet communications services from Qwest,? said Joel Arnold, executive vice president of global business markets for Qwest. ?We believe the government and business sectors will continue to demand our broadband Internet and data services throughout the year, leaving us well positioned for growth at the local, national and global level.?
Qwest will provide high-speed Internet access, frame relay and asynchronous transfer mode (ATM) network services to Albuquerque?s 120 public schools. More than 85,000 students in grades K-12 will benefit from the contract between Qwest and the school district.
Qwest has contracted with DIA to provide more than 15 people to work onsite to support all voice and data communications systems at the airport. These systems are some of the most vital components to DIA as they control everything from the network that supports the airline computer systems to the core phone and data network and DIA?s closed-circuit and public-access television.
Qwest is providing the high-speed dedicated Internet access backbone to connect Fuzion?s hubs to central offices. Utilizing a hybrid of wireless and wireline technologies, Fuzion offers integrated broadband access solutions to address the high-speed connectivity needs of small-, medium- and enterprise-level businesses.
Georgia Technology Authority
In the agreement with GTA, Qwest will provide state agencies with domestic call origination and toll-free calling services, as well as calling cards. The state of Georgia will realize significant cost savings by purchasing these voice services from Qwest.
Qwest Communications International Inc. (NYSE: Q) is a leader in reliable, scalable and secure broadband Internet-based data, voice and image communications for businesses and consumers. The Qwest Macro Capacity® Fiber Network, designed with the newest optical networking equipment for speed and efficiency, spans more than 104,000 miles globally. For more information, please visit the Qwest web site at www.qwest.com.
This release may contain projections and other forward-looking statements that involve risks and uncertainties. These statements may differ materially from actual future events or results. Readers are referred to the documents filed by Qwest with the Securities and Exchange Commission, specifically the most recent reports which identify important risk factors that could cause actual results to differ from those contained in the forward-looking statements, including potential fluctuations in quarterly results, volatility of Qwest?s stock price, intense competition in the communications services market, changes in demand for Qwest?s products and services, dependence on new product development and acceleration of the deployment of advanced new services, such as broadband data, wireless and video services, which could require substantial expenditure of financial and other resources in excess of contemplated levels, higher than anticipated employee levels, capital expenditures and operating expenses, rapid and significant changes in technology and markets, adverse changes in the regulatory or legislative environment affecting Qwest?s business and delays in Qwest?s ability to provide interLATA services within its 14-state local service territory, failure to maintain rights of way, and failure to achieve the projected synergies and financial results expected to result from the acquisition of U S WEST timely or at all and difficulties in combining the operations of Qwest and U S WEST. This release may include analysts? estimates and other information prepared by third parties for which Qwest assumes no responsibility. Qwest undertakes no obligation to review or confirm analysts? expectations or estimates or to release publicly any revisions to any forward-looking statements to reflect events or circumstances after the date hereof or to reflect the occurrence of unanticipated events.
Matt Barkett
matt.barkett@qwest.com | 科技 |
2017-09/1580/en_head.json.gz/2572 | Hobby-Eberly Telescope Measures Masses of Two Stars with One Orbiting Planet
Hobby-Eberly,
An artist's conception of the Kepler-16 system, shown during one of its transit events. The larger star, with mass about 70 percent that of the Sun, contributes most of the total light from the system. The smaller star, with mass about 20 percent that of the Sun, contributes only 3 percent of the light. The planet is thought to be a gas giant, with mass roughly equal to that of Neptune.
Credit: NASA/JPL-Caltech/R. Hurt
A team of Penn State University astronomers has obtained very precise measurements of a pair of stars that are orbited by a planet -- like the stellar system of the fictional planet Tatooine in the movie Star Wars. The orbits of the stars and planet in the system, named Kepler-16, are aligned so that they eclipse or transit each other when observed from Earth. These new measurements will aid astronomers in understanding how stars and planetary systems form.
The Penn State astronomers used the spectroscopic capabilities of the Hobby-Eberly Telescope at McDonald Observatory to separate the light from the two Kepler-16 stars into individual wavelengths, which allowed them to precisely measure the masses of the stars. This system was discovered in 2011 by NASA's Kepler spacecraft, and was the first "eclipsing binary" star system conclusively shown to host a planet orbiting two stars. The Penn State measurements are among the most precise ever made for low-mass stars like those in Kepler-16, and also provide an important independent test of a sophisticated new technique for measuring masses from Kepler spacecraft data. "Precise measurements of stellar masses, radii, and system architectures provide important insights into how planets form," said the leader of the Penn State team, Center for Exoplanets and Habitable Worlds Postdoctoral Fellow Chad Bender. The research is published in the 1 June issue of the Astrophysical Journal Letters.
NASA's Kepler spacecraft is continuously monitoring the brightness of about 150,000 stars in order to search for Earth-mass planets. In 2011, a Kepler science team lead by Laurance Doyle at the SETI Institute combined precise timing of the Kepler-16 eclipses with a new sophisticated modeling technique to derive the masses of the two stars, which are about 70 percent and 20 percent that of the Sun, and of the orbiting planet. The technique also has been applied to several similar systems.
This graph shows the radial velocities measured with the Hobby-Eberly Telescope by the Penn State team, plotted against the orbital phase for the stars Kepler-16A and Kepler-16B, as well as velocities of the brighter star that were measured by the Kepler science team. By fitting these velocities with Kepler's Equations of Motion, the team was able to derive the stars' ratio of masses and also the individual masses. The bottom two panels show residuals to the fits, plotted as red and blue lines.
Credit: Bender et al, Penn State University
In the case of close binary stars, such as Kepler-16, light from the individual stars is blended together. "While the Kepler-16 modeling did provide a solid estimate of the masses of the stars, we wanted to separate the light from each star so they could be studied individually," said Assistant Professor of Astronomy Jason Wright, a member of the research team. The spectroscopy obtained by the Penn State researchers facilitates this disentangling, and yields the masses of the stars with a much simpler analysis.
"As the two Kepler-16 stars move in their 41-day orbit, their velocity relative to the Earth periodically changes, and we were able to track these motions by obtaining six high-resolution-spectroscopic observations with the Hobby-Eberly Telescope," said co-investigator and Assistant Professor of Astronomy and Astrophysics Suvrath Mahadevan. The team used these data to disentangle the light from the individual stars, which was a challenge because the star Kepler-16B contributes only about 3 percent of the total light from the system. "The resulting velocity measurements, combined with Kepler's Laws of Motion, directly give the masses of stars Kepler-16A and Kepler-16B with precisions of 2.5 percent and 1.5 percent, respectively," Mahadevan said.
"These mass measurements are among the most precise that have been made for low-mass stars," said Bender. "Models that describe the formation and evolution of stars and planets have improved considerably over the past decade, but making additional improvements requires new measurements of numerous stars with the precisions that we have achieved here." In addition, the Penn State results confirm the viability of the new technique originally used by the Kepler team, through completely independent data and a completely different analysis technique.
"Understanding the radii and masses of low-mass stars such as these is critical for the search for planets in habitable zones," Mahadevan said. The Kepler-16 measurements are the initial results from a much larger survey being led by Penn State to measure precise masses of more than 100 eclipsing binary stars discovered by Kepler.
"The Kepler mission is revealing at least as much about Sun-like and low-mass stars as it is about planets; it's really revolutionizing the field of stellar astrophysics," noted Wright. "These observations illustrate how measurements with the Hobby-Eberly Telescope and other ground-based telescopes amplify and extend the fantastic science Kepler is doing, and can teach us more about these stars and the planets that orbit them."
In addition to Bender, Mahadevan, and Wright, other members of the Penn State Kepler-16 team include Associate Professor Steinn Sigurdsson, Distinguished Senior Scholar and Professor Larry Ramsey, Distinguished Professor Donald Schneider, Postdoctoral Scholars Rohit Deshpande and Scott Fleming, and Graduate Students Arpita Roy and Ryan Terrien. All are members of the Penn State Department of Astronomy and Astrophysics and the Penn State Center for Exoplanets and Habitable Worlds.
The Hobby-Eberly Telescope is a joint project of the University of Texas at Austin, Penn State University, Stanford University, Ludwig-Maximilians-Universitat Munchen, and Georg-August-Universitat Gottingen.
A preprint of the paper is online at http://arxiv.org/abs/1205.0259. Funding for this research was provided by the Center for Exoplanets and Habitable Worlds, the NASA Astrobiology Institute, the Penn State Astrobiology Research Center, and the National Science Foundation.
[ C B / B K K ]
Chad Bender: cfb12@psu.edu, mobile 631-431-2967, office 814-863-4690
Jason Wright: jtwright@psu.edu, 814-863-8470
Hobby-Eberly Telescope Measures Masses of Two Stars with One Orbiting Planet (podcast)
"Where Will the Food Come From in a Hotter, More Crowded World?" is a Free Public Lecture on 21 January 2012
"The Global Pollinator Crisis" is a Free Public Lecture on 28 January 2012
"The Good Bugs: Why Agriculture Needs Microbes" is a Free Public Lecture on 4 February 2012
"Novel Solutions to Complex Diseases for Subsistence Agriculture" is Free Public Lecture on 11 February 2012
"Feeding the Future: From the Lab Bench to the Dinner Table" is Free Public Lecture on 18 February 2012
"Roots of the Second Green Revolution" is a Free Public Lecture on 25 February 2012 | 科技 |
2017-09/1580/en_head.json.gz/2643 | California's Genetically Engineered Food Label May Confuse More Than Inform By editor
May 14, 2012 TweetShareGoogle+Email Protesters demonstrate against the production of genetically modified food in front of a Monsanto facility in Davis, Calif., in March. The local protest was not specifically about labeling.
Randall Benton
/ MCT /Landov
Originally published on May 18, 2012 6:03 pm When Californians go to the polls in November, they will very likely have the chance to make California the first state in the nation to require labeling of genetically engineered food. That's according to California Right to Know, which filed a petition to force a statewide vote. And the group is pretty confident it will succeed. "Polls show that nine out of ten California voters agree that they want labeling," Stacy Malkan, spokeswoman for the group, tells The Salt. But a new analysis of the labeling initiative suggests that if it passes, it would create a complex mandate for food companies that may make it harder — not easier — for consumers to figure out what's really in their food. That's because the initiative muddies the definition of a "natural" food. The word "natural" on a food label is already pretty controversial. It's more of a marketing tool than anything else — seducing consumers into thinking it means healthier, or nearly organic, although it may simply mean minimally-processed and free from artificial ingredients. The federal government has so far declined to make the term clearer, which has led to many processed foods using the "natural" label. The activists behind the labeling initiative say they want California consumers to know what they're eating. So they're calling for any processed food or raw agricultural commodity (like corn) that has been or may have been partially or wholly produced with genetic engineering to be labeled as such. And they want to prevent processed foods with GE ingredients from using the "natural" label, too. But Peggy Lemaux, a cooperative extension specialist at the University of California, Berkeley who manages an informational website on biotechnology, says her analysis of the document concluded that "natural" — as the Right To Know group wants it — could be interpreted two different ways. One way is that processed foods could be labeled "natural" only if they are free of GE ingredients. But Lemaux says the initiative could also be interpreted as saying that no processed food can be labeled "natural", whether or not it is GE or contains GE ingredients. However, Malkan, of Right to Know, says the initiative merely intended to keep food with GE ingredients from being called "natural." "The language is clear that non-GE processed foods could still be labeled 'natural,'" says Malkan. Activists have been calling for labeling of GE food for many years, but recent petition drives and polls suggest support for labeling is greater than it has ever been. As we reported in March, a coalition calling itself Just Label It commissioned a survey from a national pollster, which found that 91 percent of voters favor labeling. Some 40 countries around the world now require labels for GM foods. But the U.S. Food and Drug Administration has maintained a firm stance since 2009 that GM labeling is unnecessary. The agency says genetically modified food is essentially the same as other food and poses no safety risk. Lemaux, who has done extensive reviews of the scientific literature on GE foods, agrees with the FDA. "This [labeling measure] is not going offer any additional safety to people; it's really not a food safety issue because there's no real evidence this stuff is unsafe," says Lemaux. What's more she says, the GE and natural labels may scare less savvy consumers away from affordable, healthful foods. And, as we've reported before, Americans really don't understand what genetically engineered food is all about. 
"If you're looking to know what's in your food, well there's a lot of stuff in your food, and there's already a lot of stuff on the label," says Lemaux. "And a lot of people already don't read the label." UPDATED EDITOR'S NOTE 5:00 pm, Wednesday: Peggy Lemaux initially told NPR she thought that the passage of the labeling initiative would mean that no processed food could be labeled "natural" unless it was a processed organic food. She later revised her analysis. In her second take, she concluded that the section on labeling "natural" could be interpreted different ways, and that the interpretation will ultimately be decided in the courts.Copyright 2012 National Public Radio. To see more, visit http://www.npr.org/. TweetShareGoogle+EmailView the discussion thread. © 2017 Tri States Public Radio | 科技 |
El Nino Seen As Trigger For Violence In The Tropics
By Jon Hamilton
Aug 24, 2011
This image shows the above-normal water temperature in the Pacific Ocean during the December 1997 El Nino. Green-blue colors represent normal temperatures; dark red indicates hotter water.
Scientists say there's a link between climate and violent conflict. A statistical analysis of civil conflicts between 1950 and 2004 found that in tropical countries, conflicts were twice as likely to occur in El Nino years. The analysis appears in the journal Nature.

El Nino occurs when there is unusually warm water in the Pacific Ocean near the equator. But it affects weather patterns in tropical countries around the globe. "Half the world's population experiences a completely different climate regime," says lead author Solomon Hsiang, a researcher at Princeton University. In most places, El Nino means the weather gets warmer and drier, sometimes for years. The opposite occurs with La Nina conditions, which occur when waters in the Pacific become unusually cool.

Hsiang and his colleagues looked at 93 tropical countries and 82 other countries to see whether the El Nino-La Nina cycle affected civil conflict, which they defined as a new dispute between the government and an organized group that results in at least 25 battle-related deaths. In La Nina years, there was a 3 percent chance that a tropical country would have a civil conflict, Hsiang says. "When the global climate shifts into its relatively hotter and drier El Nino state," he says, "the rate of conflict jumps — it actually doubles all the way up to 6 percent."

But not all countries were affected by El Nino, Hsiang says. "The countries that are most sensitive to El Nino are the poorer countries," he says. "Wealthier countries such as Brazil or Australia don't exhibit civil conflicts." There's also no effect in countries such as Libya, which are outside the tropical belt, Hsiang says.

Fighting Over Resources?

Historians have long used anecdotal evidence to suggest that climate has played a role in many conflicts. For instance, the French Revolution occurred during an unusually hot summer, and a drought coincided with violent conflicts in Rwanda in the 1990s. In 2009, a controversial study suggested a link between local temperatures and the likelihood of conflicts in Africa. But many scientists felt the study had statistical weaknesses. The new study is far more robust, says Andy Solow, a statistician at the Woods Hole Oceanographic Institution in Massachusetts who wrote a commentary that accompanied the study.

It's still not entirely clear why El Nino conditions would lead to conflict, Solow says. But he says it's easy to think of reasons the two might be related. "People don't just go to war because the weather changes," Solow says. "The effect of weather on human behavior is presumably operating through resource scarcity or food scarcity or something like that." That could have been a factor in both the French Revolution and violence in Rwanda. In both cases, hot, dry weather had caused food shortages. And Solow says El Nino is well known to affect crop yields in tropical countries. "A lot of these countries are poor and mainly agricultural," he says. "As climate conditions change, that can put stress on the agricultural system in those countries — also possibly on water resources and other resources. And that may lead to conflict." Researchers still need to figure out whether that's really what's going on with El Nino in the tropics, Solow says.

Even without knowing precisely how El Nino affects violence, it may be useful to consider climate when trying to anticipate conflicts in certain countries, Hsiang says.
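To make the headline numbers concrete, here is a minimal sketch of the kind of tally behind them - not the authors' actual code, and the tiny example table, column names and values are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical table: one row per country-year, with the ENSO phase and
# whether a new civil conflict (>= 25 battle-related deaths) began that year.
data = pd.DataFrame({
    "country":  ["A", "A", "B", "B", "C", "C"],
    "year":     [1997, 1998, 1997, 1998, 1997, 1998],
    "enso":     ["el_nino", "la_nina", "el_nino", "la_nina", "el_nino", "la_nina"],
    "conflict": [1, 0, 0, 0, 0, 0],   # 1 = new conflict onset
})

# Annual conflict-onset rate by ENSO phase: the study's figures of roughly
# 3 percent in La Nina years versus 6 percent in El Nino years for tropical
# countries are this kind of grouped average, computed over 1950-2004.
rates = data.groupby("enso")["conflict"].mean()
print(rates)
```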
For instance, the current troubles in Somalia are occurring at a time when climate conditions are contributing to food shortages. But Hsiang says being able to anticipate trouble doesn't necessarily mean other countries should step in to prevent conflict. "Some conflicts are important," he says. "So it's not up to us to say that the conflicts per se should be stopped." The goal should be anticipating the needs of innocent bystanders who suffer in these conflicts, Hsiang says.
Copyright 2011 National Public Radio. To see more, visit http://www.npr.org/.
More of the World’s Vertebrates Are Facing Extinction, but Conservation Efforts Have Helped
27 October 2010
Kathy Wren
Increasing numbers of birds, mammals and amphibians have moved closer to extinction in the last several decades—but not as far as they would have if no conservation measures had been enacted, researchers report in ScienceExpress this week.
The paper was published in conjunction with the 10th Conference of the Parties to the Convention on Biological Diversity in Nagoya, Japan, where world leaders are discussing the failure to meet the convention’s targets for 2010 and negotiating a revised plan for tackling biodiversity loss and new targets for 2020.
To assess the status of the world’s vertebrates, Michael Hoffmann of the International Union for the Conservation of Nature (IUCN) and a large, international team of coauthors analyzed data for over 25,000 vertebrate species categorized on the IUCN Red List of Threatened Species.
“We document just how severe the situation is for vertebrates, the species of greatest interest to people, but we also show that the situation is not futile and that the loss of populations and species along with the benefits we accrue from them can be halted and reversed,” said co-author Stuart Butchart of BirdLife International.
Global patterns of threat, for land (terrestrial and freshwater, in brown) and marine (in blue) vertebrates, based on the number of globally threatened species in total. | Image courtesy of Science/AAAS
The scientists report that one-fifth of species are classified as threatened, and this figure is increasing. On average, 52 species of mammals, birds and amphibians move one category closer to extinction each year.
The tropics, especially Southeast Asia, are home to the highest concentrations of threatened animals, and the situation for amphibians is particularly serious. Most declines are reversible, but in 16% of cases they have led to extinction.
The researchers also asked whether conservation efforts such as establishing protected areas and adopting national legislation have made any measurable contribution to preserving biodiversity. By looking at species whose conservation status has improved in response to some type of conservation measure, Hoffmann and colleagues estimate that overall declines would have been approximately 18% worse without any conservation actions.
"And hence, even though as we've been hearing the Convention on Biological Diversity's target for 2010 hasn't been met, this work provides good evidence that conservation actions can work and that if stepped up there are still realistic hopes of reducing some of the drivers of the biodiversity decline," said Andrew Sugden, deputy and international managing editor at Science.
A set of projections in an accompanying Review article also forecasts biodiversity declines during the 21st century, but with a wide range of possible outcomes. This broad range arises because we have significant opportunities to intervene through better policies, and because scientific projections include large uncertainties, which is an urgent problem in itself, according to Henrique Pereira of Universidade de Lisboa, Portugal, and coauthors.
The International Union for Conservation of Nature, with the cooperation of AAAS/Science, organized a news conference on the new research in Nagoya, in conjunction with the Convention on Biodiversity meeting. An international news teleconference was organized by AAAS/Science in cooperation with IUCN.
Read the abstract for “The Impact of Conservation on the Status of the World’s Vertebrates.”
Read the accompanying Review article, “Scenarios for Global Biodiversity in the 21st Century.”
Read the transcript of, or listen to, the international news briefing on the new research.
Photography Review
At MIT, the presence of pastness
By Mark Feeney
Globe Staff May 06, 2014
“Close-up of a Man With Piercing Eyes,” Geo. H. Dresser’s albumen print “Native American With Blonde Baby,” and the tintype “Sit!”
CAMBRIDGE — In his novel "The House of the Seven Gables," Nathaniel Hawthorne made a daguerreotypist the embodiment of the modern age. It was a sensible choice for 1851. Daguerreotypes, the first type of photograph, had been invented a scant dozen years before. They were as modern as modern could be, right down to the shininess of the silver-plated copper on which the image was exposed. Knowledge of that up-to-the-minute past makes the way daguerreotypes now look — as antique as an antimacassar covering a corset draped across a spittoon — all the more affecting as an evocation of time's passage. Every photographic image is about the past. A daguerreotype doesn't just state that pastness. A daguerreotype proclaims it.

Daguerreotypes originated in France, named for their inventor, Louis Daguerre. Yet as "Daguerre's American Legacy: Photographic Portraits (1840-1900) From the Wm. B. Becker Collection" amply demonstrates, the United States very quickly made them its own. The show runs at the MIT Museum through Jan. 4.
The daguerreotype process was never patented. This ensured that it soon spread to other countries. Then, as now, proprietary technology is its own speed bump. Two factors especially suited the daguerreotype to the United States (first introduced here in 1840). Where once portraits had been restricted to the rich and aristocratic, the daguerreotype made them available to anyone who could afford the relatively small expense. The daguerreotype represented a democratization of culture. What could be more American than that? Second, the greater degree of affluence in the United States meant that much of the population could pay for a sitting — or several. Within a decade, there were thousands of daguerreotypists in the United States. By 1853, they were taking 3 million daguerreotypes a year.

The prevalence of daguerreotypes in US households made them no less prized by owners. So many of the daguerreotypes in "Daguerre's American Legacy" are housed in handsome metal cases, with velvet-lined interiors, and elegantly curved mattes or borders. They're displayed like Victorian reliquaries, with every man a saint and every family holy. Their small scale makes them seem all the more precious. Daguerreotypes, being irreproducible, can't be blown up in size as images produced by other photographic processes can. That scale also imparts a sense of intimacy that makes the images all the more affecting.
There are more than 250 items in the show. Besides daguerreotypes, there are tintypes (a close photographic cousin), ambrotypes, and albumen prints (both of them products of slightly later processes). The show includes related items: cameras, lenses, handbills, even a storage box. One gets a sense of cultural context, of how large early photography loomed in the lives of average Americans.
DAGUERRE’S AMERICAN LEGACY: Photographic Portraits (1840-1900) From the Wm. B. Becker Collection
MIT Museum,
265 Massachusetts Ave.,
http://web.mit.edu/museum/exhibitions/daguerreotype.html
Average is an important word, as is anonymous. Don't expect to find presidents and senators and men of letters in "Daguerre's American Legacy." The vast preponderance of sitters are unknown — another aspect of democratization. Who was the Lincolnesque-looking individual identified as "Close-up of a Man With Piercing Eyes"? We don't know, other than that he certainly wasn't Lincoln. Nor does this anonymity matter. With the help of Daguerre's process these sitters have done something that only the most famous have done, defeat time.

The photographers are mostly anonymous, too, though not always. There are examples in the show of work from Boston's Southworth & Hawes, perhaps the most artistically distinguished of pre-Civil War American photographers, and the studio of Mathew Brady. But otherwise what we see is the product of a visual climate, and all the more valuable for that fact. The formal limitations of the daguerreotype make the variety of content on display here all the more striking — and valuable. To call "Daguerre's American Legacy" a visual census would be far too sweeping, but not altogether inaccurate. The show gives a sense of just how large and how varied an increasingly large and varied country was becoming.
There are also moments when the images can seem nearly as modern now as they once did then. The subject of Geo. H. Dresser’s albumen print “Native American With Blonde Baby” is just what it says it is. What the title doesn’t express is how effortlessly the smile on the woman’s face subverts any idea that America isn’t or can’t or shouldn’t be multicultural. As for the tintype “Sit!” (the title being a canine-directed command), it suggests that William Wegman’s great-grandfather knew how to operate a camera. Either that, or people who believe in reincarnation aren’t barking up the wrong tree. Mark Feeney can be reached at mfeeney@globe.com. | 科技 |
Tablets to keep Linfox fleet rolling
Trevor Clarke
Linfox is set to roll out a new mobile machine-to-machine (M2M) solution for its fleet of vehicles across the Asia-Pacific region.

Linfox trucks and road trains will have up to six proprietary in-cabin communications systems - potentially costing up to $12,000 in capital expenditure per vehicle - replaced by one communications hub. Drivers will use a ruggedised Motorola tablet or a smartphone running on Google's Android operating system to get access to applications.
Linfox will use machine-to-machine technology on its fleet. Photo: Stephen Moynihan
''We have just finished a trial on a device which gives us a communications hub that allows us to have satellite telephony, mobile telephone network and industrialised Wi-Fi in one,'' Linfox's president of supply chain solutions and chief information officer, John Ansley, said. ''On top of that layer, we put some intelligence about some business rules.''

These rules determine what vehicle information is automatically sent via M2M technology to the control centre depending on the vehicle's location and connectivity.

''If you talk about our triple road trains that carry a lot of fuel, that costs us around $1 million to put them on the road,'' Mr Ansley said. ''You know you need a certain number of hours out of every day for that vehicle to make money. So when you are pulling information out of the management system and getting diagnostics that can indicate something is going to happen over the next two, three or four days and you can do preventative maintenance based on that - that is absolute gold. Your asset management improves out of sight.''
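As a rough illustration of the kind of business rules described above - not Linfox's or IBM's actual implementation - the sketch below decides which queued telemetry goes out over whichever link is currently available. The message types, link names and priority thresholds are assumptions made up for this example.

```python
# Hypothetical rule-based telemetry forwarding for a vehicle hub.
# Priorities and link budgets are illustrative assumptions only.
PRIORITY = {"critical_fault": 0, "position": 1, "engine_diagnostics": 2, "video": 3}

# Highest (i.e. least urgent) priority level each link is allowed to carry;
# satellite is the most expensive, so it only carries the most urgent data.
LINK_BUDGET = {"wifi": 3, "mobile": 2, "satellite": 0}

def messages_to_send(queued, available_link):
    """Return the queued messages the rules allow on the given link."""
    budget = LINK_BUDGET[available_link]
    return [m for m in queued if PRIORITY[m["type"]] <= budget]

queue = [
    {"type": "critical_fault", "detail": "brake pressure low"},
    {"type": "position", "detail": "-37.81,144.96"},
    {"type": "video", "detail": "cab camera clip"},
]
print(messages_to_send(queue, "satellite"))  # only the critical fault goes out
print(messages_to_send(queue, "wifi"))       # everything uploads back at the depot
```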
''We've been able to build applications that are specific to our drivers' needs and also our customers' needs. So with customer-specific apps, we can have information on there like documents that show, for instance, induction at a site they need to get to or the detailed road map of a mine site,'' Mr Ansley said.

''The tablets sit in a cradle and once there, the functions are limited to push functions. The driver can't actually operate it if the truck is moving.''

The tablets, however, can be used as a front end for the satellite phones, mobile phones, and entertainment system. They also have apps for signing documents such as receipt of delivery, along with still and video capabilities for capturing things such as damage to trucks for insurance and repair purposes. Photos are tagged with GPS co-ordinates and can be tagged with commentary.

Telstra director of M2M and partners, Mike Cihra, said a handful of industries such as logistics, transport, utilities and security were already big users of M2M solutions. Other industries such as agriculture, automotive and hospitality were starting to explore their options and adopt M2M.

Telstra's M2M revenue indicates burgeoning interest in the technology. It grew by 15.9 per cent year-on-year to $80 million in the year to June 2012. It was one of the telco's fastest growing segments but dwarfed by mobile broadband, which is more than 10 times larger at over $1 billion and growing at 10.8 per cent annually. At the end of June 2012, Telstra had 809,000 M2M mobile services in operation - adding more than 10,000 services a month for the past 12 months.

"M2M technologies are set to power a range of new intelligent applications in the future; these include metering and monitoring, surveillance, automotive and digital signage scenarios," said Telsyte research director, Foad Fadaghi.

"The main barrier to adoption has been standardised platforms and pricing, however both are improving substantially, allowing many more businesses to start utilising such technology."
Hurricane forecasters predict more Atlantic storms
Posted: Aug 09, 2012 11:47 AM ET
An Atlantic hurricane season that got an early start is likely to stay busy and may produce a few more storms than originally predicted, U.S. forecasters say.
Men stand on top of a marine fender that was ripped from a dock after Hurricane Ernesto made landfall overnight in Mahahual, near Chetumal, Mexico, on Aug. 8. (Israel Leal/Associated Press)
The updated forecast for the season that runs from June 1 to Nov. 30 says there are duelling weather conditions that could also tamp down activity later in the six-month stretch.
A total of 12 to 17 tropical storms were expected to form through the season overall, with as many as five to eight hurricanes, forecasters said. Two to three of those could become major hurricanes, which have top winds of 178 km/h or higher.
So far this year there have been four tropical storms and two hurricanes. One of those — Ernesto — made landfall late Tuesday as a Category 1 hurricane on Mexico's Caribbean coast near its border with Belize.
In May, forecasters had predicted nine to 15 tropical storms, with four to eight strengthening into hurricanes.
A normal Atlantic season produces 12 named storms, six hurricanes and three major hurricanes, based on a 30-year average, NOAA said. The 2011 hurricane season, one of the busiest on record with 19 named storms, produced Irene, one of the costliest storms in U.S. history.
El Nino could have moderating effect
Forecasters said warmer-than-normal sea surface temperatures and wind patterns that favour storm formation mean chances are higher for an above-normal season, while the expected development of the El Nino pattern over the Pacific may suppress storms later in the season's peak period.
The storm-feeding atmospheric and marine conditions are linked to an ongoing period that began in 1995 of high activity for Atlantic hurricanes, said Gerry Bell, the lead seasonal forecaster at NOAA's Climate Prediction Center.
Early-season activity in the deep tropics off Africa's coast, which produced Ernesto and tropical storm Florence early this month, also generally indicates a more active season, Bell said.
"Conditions are more conducive right now, but we expect them to become less favourable if El Nino develops as expected," Bell said.
The weather phenomenon known as El Nino, which is expected to form this month or in September, warms Pacific waters near the equator and increases wind shear over the Atlantic, tearing storms apart. The peak of the Atlantic hurricane season runs from August through October.
"In May, we were uncertain about El Nino's development and impact. Now we have a high confidence that El Nino will develop this month or next, but also that its influence will be delayed until later in the season," Bell said.
The Atlantic hurricane season got off to an earlier-than-official start this year when tropical storm Alberto formed May 19 off the South Carolina coast.
Forecasters name tropical storms when their top winds reach 63 km/h; hurricanes have winds of at least 119 km/h.
No major hurricane has made a U.S. landfall in the last six years, since Hurricane Wilma cut across Florida in 2005. This August marks the 20th anniversary of Hurricane Andrew's catastrophic landfall in South Florida as a Category 5 storm.
Laura Furgione, acting director of NOAA's National Weather Service, warned U.S. coastal residents not to be complacent about the risks of a hurricane striking their homes. Andrew was the first storm of a slow season that produced just six storms.
Ernesto weakened to a tropical storm as it drenched the Yucatan Peninsula, where it caused little damage. The storm spun into the southern Gulf of Mexico, crossing waters dotted with oil rigs operated by Mexico's state oil company, and was expected Thursday to bring torrential rains and flooding to Veracruz state's lush Los Tuxtlas region.
© The Associated Press, 2012
Google overhauling YouTube with channels, pro content
It’s been obvious for awhile now that Google has big plans for YouTube. It’s been inching closer and closer toward premium content and is even rumored to be considering designating some serious money toward celebrity–branded channels. And if all that weren’t enough, a new report from the Wall Street Journal says that Google is busy with a “major site overhaul to organize [YouTube’s] content around ‘channels.’”
As Internet TV makes its way into more living rooms, Google has found itself perfectly poised to compete with the likes of Amazon, Hulu, and Netflix for a piece of the streaming content pie. Of course, this will only be possible if it carefully, judiciously makes the transition from the mess of home movies, music videos, and cult content it currently supplies.
It’s about to begin. The report says that YouTube will now sport categorized channels (“such as arts and sports”), and that rumored premium-grade content is on its way. “YouTube is looking to introduce 20 or so ‘premium channels’ that would feature five to 10 hours of professionally-produced original programming a week,” an insider told WSJ. These changes are slated to roll in by the end of the year, and Google is looking for new recruits to work on the project. And Google isn’t simply working on adding organization to YouTube; meetings with reputable Hollywood talent firms would suggest it’s serious about nabbing some famous personnel, be those behind or in front of the camera.
YouTube certainly is enough of a household name to command at least some attention, and if it plans to keep and grow its user base and make the kind of money it wants off advertising, this is a natural step. With such massive change, there's always the possibility of alienating passionate YouTubers who have been there since its start, but it's probably worth it for Google to further break into our living rooms.
Windows 8 app store has 20K apps, but revenue is a different story
Anna Washenko
One of the highlights of the Windows 8 OS is the expansiveness of the Windows Store. Microsoft built up the marketplace quickly to offer more than 20,000 apps within the first month of operation. Data from Distimo, a company tracking app analytics, showed that the daily download volume of the top 300 apps in the Windows Store is three times greater than the downloads of the top 300 Mac App Store selections. However, the important qualifier for that finding is that many of those Windows 8 downloads are free, meaning the store isn't nearly as valuable as the Apple App Store. In fact, about 86 percent of the offerings in the Windows Store don't cost a penny.

Distimo provided other interesting insights about the Windows Store, noting that the available apps cater about evenly to tablets and PCs. It's a big contrast to the App Store, which has just 13,000 selections for the Mac and a whopping 275,000 apps just for iPad. Microsoft has also pushed the local angle for its apps with the new store. More than 10 percent of those top 300 apps are popular in a specific geographic market. Just 65 percent of the Windows Store offerings are U.S. only, compared with 85 percent or more for other platforms.

So while the money isn't pouring in from apps, Microsoft has latched onto a strategy that seems to have a good chance for success. There's an interesting split in the purpose of OS X/iOS apps and Windows apps. The Apple App Store has made a hefty chunk of change on games for iPhones and the iPad, whereas not only are the bulk of the Windows apps free, but they seem to have a different focus. Many of the top Windows 8 apps are either designed for productivity or easier use of your Windows 8 device, or they are apps tied to popular websites. It is a subtle indication that Microsoft machines and the people who use them still have a serious edge.

Source: TechCrunch
EGU - Awards & Medals - Runcorn-Florensky Medal - James W. Head
James W. Head
Runcorn-Florensky Medal 2010
The 2010 Runcorn-Florensky Medal is awarded to James W. Head for his outstanding work in studying volcanism and tectonism in the formation and evolution of planetary crusts and for developing remarkable US-European research collaborations in Earth and planetary sciences.
Jim Head is the Louis and Elizabeth Scherck Distinguished Professor of Geological Sciences. He came to Brown University in 1973, following his work with the NASA Apollo program, in which he analysed potential landing sites, studied returned lunar samples and data, and provided training for the Apollo astronauts. His current research centres on the study of the processes that form and modify the surfaces, crusts and lithospheres of planets, how these processes vary with time and how such processes interact to produce the historical record preserved on the planets. Comparative planetology, the themes of planetary evolution and application of these to the study of early Earth history are also of interest. He has followed up his research on volcanism, tectonism and glaciation with field studies on active volcanoes in Hawaii and at Mount St. Helens, on volcanic deposits on the sea floor with two deep sea submersible dives and during two field seasons in the Antarctic Dry Valleys.
Several research projects are under way in the field in Antarctica, on the Earth's sea floor, and in assessing data from planetary surfaces to study climate change on Mars, volcanism on the Moon, Mars and Venus, the geology of the surface of Mercury and the tectonic and volcanic evolution of icy satellites. Since 1984, Head has convened the Vernadsky Institute/Brown University microsymposia, held twice a year in Moscow and Houston. He has served as Co-investigator in NASA and Russian space missions, such as the Soviet Venera 15/16 and Phobos missions, and the US Magellan (Venus), Galileo (Jupiter), Mars Surveyor, the Russian Mars 1996 and Space Shuttle missions. Head is presently a co-investigator for the NASA MESSENGER mission to Mercury and the Moon Mineralogy Mapper (M3), as well as the European Space Agency's Mars Express mission.
He has supervised a large number of Ph.D. students who have subsequently pursued prestigious careers. He has published more than 430 refereed articles. He has received numerous prizes for his research activities and publications and has played a leading role in the development of collaborative planetary research between the US and Europe (West and East).
European eShop daytime browsing restrictions dropped
Connor Sheridan
No more waiting until 11 p.m. to download 18+ rated titles
European Wii U and 3DS owners may now gawk at 18+ rated eShop titles whenever they darn well feel like it. Eurogamer reports that Nintendo of Europe has finally removed the restrictions which allowed potentially objectionable content to be perused only between 11 p.m. and 3 a.m., regardless of the user's registered age.

The restriction was meant to follow German media regulations but, according to this message posted to the eShop, even the regulators felt it was a bit over the top:

"Following analysis of the Parental Controls system on Wii U and Nintendo 3DS in cooperation with USK, the German Entertainment Software Self-Regulation Body, it was deemed that Nintendo's Parental Control system is of very high quality and offers a remarkable level of protection for children. Nintendo's Parental Control system was found to have proved itself in practice."

Try not to fall too far into hedonistic cycles of browsing ZombiU screenshots in the daylight.
PS Vita: Rumored March iPad 3 release 'doesn't concern' Sony
The PlayStation Vita officially releases today worldwide, opening up an entirely new gaming experience for mobile gamers. Of course, when you hear "mobile gaming", the average mind doesn't necessarily think PlayStation Vita. Instead, it likely goes to Apple and its iPhone and iPad, which have dominated the mobile gaming market.
This is part of the problem for Sony, and with Apple rumored to announce the iPad 3 in March of this year, it could pose a challenge to Sony, which hopes the PS Vita will fill a void for those looking to take their gaming experience on the go. Specs for the iPad 3 are nothing more than rumor at this point, but early reports indicate the iPad 3 will be a much more powerful mobile gaming rig than the previous iPad 2. With that being said, Gamasutra asked Sony's senior vice president of its Worldwide Studios, Scott Rohde, if this or the possibility that Apple could release a new iPad every year concerns him or Sony.
"Well, I don't think it concerns me at all. And you have to understand, like I said earlier, I'm a fan of all these devices. I really honestly am," Rohde said. "I'm an iPad owner, and that's not something that I'm ashamed to admit, of course. And I'm also telling you that as soon as I got my launch edition of the PlayStation Vita, that iPad is absolutely gathering dust."
Rohde attributes the dust gathering to the PlayStation Vita offering a "totally different kind of premium experience."
"My iPad is now relegated to a handful of emails here and there, and when I want a gaming experience I am going to pick up my Vita, so it doesn't concern me if the iPad has more horsepower or something along those lines, because the PlayStation Vita is specifically built with gamers in mind, and the iPad is not. It's a multifunctional device."
Rohde's comments actually echo those of Shuhei Yoshida, president of worldwide studios for Sony Computer Entertainment, who told Venturebeat the Vita would attract a different kind of audience. "We’re targeting people who really want to play games, and also would like to have cool gadgets," Yoshida explained. "We believe that once we create some very unique and strong experiences, people will find the value in having another device."
Yoshida added, "Even if you may have an iPad, you might be convinced to have another device that plays games really well and does other stuff that you do every day as well. You might decide, when you’re on a trip or going to school, you could choose to have the iPad, but it’s a bit bigger or heavier, while you can put the PS Vita in your jacket pocket."
Sony has already made it clear they are targeting a specific demographic. Their estimated $50 million "Never Stop Playing" marketing campaign for the Vita targets specifically "men in their 20s who play video games eight hours a week or more and own a PS3 console."
It's a specific demographic, indeed, and perhaps the reason why Rohde believes "there's a place for both of these devices in our world, and I firmly believe that is true."
Brownsville, Texas, Gets the WiMAX
Brownsville found its safest option for a new wireless network was also the most cutting-edge.
by Patrick Michels
December 12, 2007
Never the sort of place to go along with the crowd, the seaside border city of Brownsville is taking a novel approach in its new citywide wireless network, leaving Wi-Fi hubs to households and coffee shops, and building its own system on powerful, but largely untested, Worldwide Interoperability for Microwave Access (WiMAX) technology. As the city unveils the system this fall, Brownsville will be joining just a handful of cities around the country to buck the Wi-Fi trend and put their money behind WiMAX. City officials say they made the potentially risky choice not in the interest of being early adopters, but because it promised the security and stability they were looking for. "I think it was a better choice for the city, which is what we asked IBM for: Give us what's best for Brownsville, not what everybody else is doing," said Gail Bruciak, Brownsville Management Information Systems director.

Long-touted as the "next big thing" in wireless, WiMAX has dazzled industry watchers with its promised improvements over Wi-Fi: stronger signals, greater information-carrying capacity and a range of more than 30 miles from a single transmitter. That extra range means cities can cover their entire area with just a few towers, akin to cellular transmitters, instead of countless little Wi-Fi nodes scattered around town. WiMAX technology uses licensed frequencies around 2.5 GHz, and works on an 802.16 standard set by the Institute of Electrical and Electronics Engineers, which developed the 802.11 standard on which Wi-Fi is based. Because WiMAX uses signals at licensed frequencies not open to the public, the city can count on its data moving quickly and privately without interference. By depending on three strong towers for the WiMAX signal, Brownsville emergency services will have access to the network when foul weather might disable other types of wireless infrastructure.

Operationally Driven

While the most publicized municipal wireless networks are built to provide Internet access for the general population, Brownsville's WiMAX network is mostly for carrying the data load for official city business. The WiMAX system, new servers and data storage, and a Web-enabled software suite, are all part of a sweeping, $4.2 million deal with IBM to revamp Brownsville's technology. The new equipment will create a much-needed change in the city's IT infrastructure, Bruciak said. "We're running everything for the city, with the exception of emergency services, on an HP3000 from 1994. The software is COBOL - it's old. Of course, there's no maintenance for it, we can't get parts for [the system] anymore," she said. "So we knew we had to migrate off."

IBM Project Manager Sean Guy said the grand scope of the Brownsville project is what makes it so unique. "This effort is not simply a wireless deployment," he said, calling it a "complete refresh of the core IT infrastructure." The city contracted with IBM for system design and deployment, but local service provider Rioplex Broadband will maintain the network. The five-year deal with Rioplex brought the total cost of the project to $6.6 million. "It's relatively inexpensive," Bruciak said, pointing out that with the WiMAX equipment and signal towers, the city avoids the cost of buying and installing Wi-Fi nodes around the city. To pay for the project, Bruciak said the city relied heavily on grant funding - and the creativity it takes to tap the right funding sources.
"Basically we went out and borrowed money to get this," she said, funding much of the project through the city's annual capital projects bond issue. Beyond that, Bruciak said the city cast a broad net in its grant applications. "We started with the tech grants, and they sort of dried up. Then we looked at homeland security, but it's so hard to get that sort of funding," she said. The city struck out in Patrick Michels
Sony FE 24-70mm f/4 ZA OSS Zeiss Vario-Tessar T* SEL2470Z
FE-Mount
Stabilized
Sony A7R
March 28, 2014 by William Brawley
The Sony FE 24-70mm ƒ/4 ZA OSS Carl Zeiss Vario-Tessar T* was one of the two new zoom lenses announced alongside the full-frame Sony A7R and A7 mirrorless cameras. This Carl Zeiss-branded lens features a very versatile 24-70mm focal length range making it suitable for landscapes, portraits and pretty much everything in between. The lens includes a variety of high-end features like Sony's Optical SteadyShot image stabilization technology, Carl Zeiss' T* coatings for reduced flare and boosted contrast, and a rugged build quality that's resistant to dust and moisture.
This compact full-frame zoom lens ships with a bayonet-style lens hood, front and rear caps and a soft pouch. The Sony FE 24-70mm ƒ/4 ZA OSS Carl Zeiss Vario-Tessar T* is currently available for purchase for a retail price around $1,198 - (Adorama, Amazon, B&H).
Despite carrying the Zeiss branding, which is typically indicative of high-end results, we felt that the Sony FE 24-70 Zeiss lens fell a little short of our expectations. At 24mm, the lens displays good sharpness right in the center of the frame, even wide open, but outwards and especially in the deep corners it's noticeably soft. Surprisingly, even stopping down doesn't improve the corner softness at 24mm, and by ƒ/16-ƒ/22, diffraction comes into play and reduces sharpness all around even more.
Zooming in to 35mm and 50mm improves sharpness significantly, especially in the corners, and throughout the aperture range (until diffraction hits at ƒ/22). However, as we saw at 24mm, 70mm on the Sony FE 24-70 displays a decently sharp center, but with considerably softer corners that aren't much improved by stopping down. Surprisingly, corners at 70mm appear even softer than they do at 24mm.
All in all, we were a little surprised to see a Zeiss-branded lens be as soft in the corners as the Sony FE 24-70mm f/4 was. It's perhaps important to note, though, that we base our testing on RAW file output, so it's very likely that Sony cameras will apply significant in-camera correction to the JPEG images, resulting in much more uniform sharpness.
The Sony FE 24-70 Zeiss lens does quite well in controlling chromatic aberration. At all focal lengths, and all apertures, the lens displays very low CA on average. The lowest amount of CA is at 50mm, which hovers just above zero at ƒ/4. We saw a little up-tick in average CA at 70mm ƒ/4, but it was still very minor, and something that would easily be alleviated by in-camera JPEG processing or quick adjustment in your favorite photo editing program.
If you're a fan of vignetting, you're in luck, as the Sony FE 24-70 shows a lot of it, particularly at 24mm. Wide open at 24mm, the lens shows over 1.5 stops of light loss, and even by ƒ/5.6, it's still over 1 stop. Stopping all the way down to ƒ/16-ƒ/22, vignetting still shows over 0.5EV of light falloff.
The longer focal lengths fare better, but the three other focal lengths we tested (35mm, 50mm and 70mm) all display significant vignetting from ƒ/4 until around ƒ/8. Stopping down past ƒ/8 still shows moderate vignetting at these focal lengths. Interestingly, 70mm shows the next highest amount of vignetting when shot wide-open at just under 1 EV of light loss.
Shading is quite easy to correct for, so here again, JPEG results on Sony cameras could be considerably better than what we're seeing in the RAW files.
The geometric distortion characteristic of this lens is rather complex. On average (meaning, affecting most of the frame), there's barrel distortion to a varying degree at all focal lengths, even the more normal and telephoto focal lengths of 50mm-70mm. At 24mm, barrel distortion is the strongest, between 0.5-1%, while 35mm shows the least, at around 0.2%. However, when you look at the maximum distortion values, which generally show what's going on around the edges of the frame and in the corners, we see over 1% barrel distortion at 24mm, yet close to -1% pincushion distortion at 35mm and even more pincushion distortion at 50mm and 70mm (almost -1.5%). At the risk of sounding repetitious, distortion is also something that's subject to in-camera correction, and in this case would be better done there than in Photoshop or Lightroom, given its complex character, with both pincushion and barrel distortion being present within the frame at the same time.
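Because both the shading and the distortion described above are smooth, radially varying effects, they are exactly what profile-based correction handles well. The sketch below illustrates the general idea - it is not Sony's or Adobe's actual correction math, and the coefficients are made-up placeholders rather than measured values for this lens.

```python
import numpy as np

# Illustrative correction profile for one focal length. The coefficients are
# placeholders for demonstration, not measured values for the FE 24-70mm f/4.
VIGNETTE = (0.4, 0.2)     # relative illumination ~ 1 - a*r^2 - b*r^4
DISTORT = (-0.04, 0.03)   # r_distorted = r_ideal * (1 + k1*r^2 + k2*r^4)

def vignetting_gain(r):
    """Gain that flattens corner shading at normalized radius r (0=center, 1=corner)."""
    a, b = VIGNETTE
    falloff = 1.0 - a * r**2 - b * r**4
    return 1.0 / falloff

def undistorted_radius(r_d, iters=3):
    """Invert the radial distortion model by simple fixed-point iteration."""
    k1, k2 = DISTORT
    r_u = r_d
    for _ in range(iters):
        r_u = r_d / (1.0 + k1 * r_u**2 + k2 * r_u**4)
    return r_u

r = np.linspace(0.0, 1.0, 5)
print(np.round(vignetting_gain(r), 3))                      # > 1 toward the corners
print(np.round([undistorted_radius(x) for x in r], 3))
```

Mixing the signs of the two radial terms, as in the placeholder values here, is one way a profile can describe the barrel-then-pincushion behavior noted above within a single frame.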
NOTE: We normally ignore in-camera JPEG results, because the in-camera processing introduces too many variables to permit good comparisons between lenses on different platforms. Given the extent of corner softness, shading, and distortion we've found in the Sony FE 24-70mm f/4 lens, though, we're going to take a look at some camera JPEG images captured along with the RAWs used for this analysis, to see to what extent in-camera correction addresses some of these issues. Stay tuned for an update on that within the next week or so.
The Sony FE 24-70 Zeiss lens autofocuses very quickly, taking under one second to focus through the whole range. Nice and fast. It's also extremely quiet. It should be noted, however, that by default, the Sony A7R and A7 cameras have a "Pre-AF" setting enabled, which makes the camera continually focus, which can help speed up focusing as the lens may already be near the appropriate focus distance for whatever you're shooting. This setting can be disabled, though, as we found it can be distracting as well as impact battery life.
Manual focusing is also available on this lens, though like other Sony FE lenses, it uses an electrical focus-by-wire system. The focus ring on the lens will therefore rotate smoothly, but indefinitely with no stops at minimum and infinity focus distances. Also a consequence of the focus-by-wire design, there are no focus distance markings or window on this lens.

Macro
The Sony FE 24-70 Zeiss has a close focusing distance of 1.32ft (40cm) with a maximum magnification of 0.2x (1:5 ratio), which doesn't make it a great lens for macro photography.
Like the other Sony FE lenses released so far, the Sony FE 24-70 Zeiss has excellent build quality. It has a solid feel and slight heft, while still being compact, lightweight and relatively well-balanced on the A7/R cameras (there's a bit of front-heaviness to it, but it's not awkward to use by any means). The zoom and focus rings rotate very smoothly with just the right amount of dampening -- not too stiff but not loose either to cause concern for lens creep. The lens extends slightly when zooming, making it about 1.25 inches longer at 70mm. The front element does not rotate during focusing or zooming, making the lens suitable for filters such as circular polarizers (with 67mm filter threads). Keeping with a similar design aesthetic to other Sony FE lenses, the Sony FE 24-70mm Zeiss lens features a metal barrel with a very nice, smooth, matte black finish, and fine-grained ribbed textures on the focus and zoom rings. The all-metal construction is designed to protect against dust and spray, but it is most likely not fully weather-sealed (i.e. there's no rubberized gasket around the mount). Inside the barrel sit 12 elements in 10 groups, including 5 aspheric and 1 extra-low dispersion elements. The 7-bladed, rounded aperture diaphragm makes out-of-focus areas pleasing and smooth. When it comes to buttons, rings or other exterior features, like the FE prime lenses, the FE 24-70 is fairly sparse with its zoom and focus, and not much else. Since the lens is a focus-by-wire system, there's no focus distance window built-in to the lens, nor is there a manual AF/MF toggle switch or an On/Off switch for the Optical SteadyShot system either -- those are controlled via the camera.

Alternatives
Sony does offer an alternative to the pricey $1000+ Zeiss-branded 24-70mm ƒ/4 lens -- the Sony FE 28-70mm ƒ/3.5-5.6 OSS. This zoom lens not only provides a similar range of focal lengths -- though not as wide -- but also built-in image stabilization. You do, however, lose the constant ƒ/4 aperture as well as the all-metal construction (that lens is mostly plastic, though it does have a metal lens mount like its Zeiss sibling). The big plus is that it's significantly more affordable with a retail price just shy of $500.
However, seeing as the Sony FE mount is the new kid in town, there aren't many third-party alternatives yet. That said, with Sony's own LA-EA3 A-mount to E-mount adapter (no AF) or LA-EA4 adapter (adds Sony's Translucent Mirror technology and autofocus), you could use Sony A-mount lenses, including the large and heavy Sony 24-70mm ƒ/2.8 ZA Carl Zeiss Vario Sonnar T*, which features a much brighter constant ƒ/2.8 aperture, but no image stabilization and a whopping $1,800 price tag. There's also the Sony 28-75mm ƒ/2.8, which sacrifices a bit of the wide end and adds a slight increase in focal length at the tele, as well as having a constant ƒ/2.8 aperture. However, again, you don't get image stabilization, but you do have a slightly more affordable price at around $900.
Now, the smaller flange distance of the Sony A7R and A7 makes using adapters with a wide variety of third party lenses quite simple, as the A7R/A7 use a standard Sony E-mount. There are other adapters out there allowing users to mount Canon EF lenses and Nikon F-mount glass to E-mount cameras, and the resulting combinations would be too much to list here. It's certainly worth knowing that the Sony A7R and A7 provide a very flexible full-frame camera platform for fans of lenses from a variety of manufacturers.

Conclusion
After seeing superb results from the two Sony/Zeiss FE primes, the 55mm ƒ/1.8 and 35mm ƒ/2.8, we had high hopes that the Zeiss-branded 24-70 ƒ/4 lens would be another excellent piece of glass. In terms of build quality, the Sony FE 24-70 Zeiss is indeed excellent, with an all-metal construction and that characteristic Sony and Zeiss matte black design. When it comes to RAW-file image quality, however, we found ourselves a little disappointed. With soft corners at essentially all focal lengths (and even stopped down), plus strong vignetting and distortion, the Sony FE 24-70mm ƒ/4 ZA OSS Carl Zeiss Vario-Tessar T* didn't meet our expectations. It's possible that default corrections applied in-camera to the JPEG files may counteract some of these issues; we'll be investigating that possibility a bit further, and will report back here with the results.
If you're a landscape photographer who demands sharp images corner-to-corner and low vignetting, and you primarily work from RAW images, this may not be the lens for you. And even for RAW shooters doing general-purpose photography, for which a 24-70mm lens is very useful, the Sony FE 24-70 Zeiss lens carries a hefty price tag without delivering the kind of performance we'd expect.
Thankfully, the Sony A7/R cameras are very flexible when it comes to adapters, so fans of wide-angle and general-purpose focal lengths aren't left out in the cold, as lenses from Sony's A-mount lineup or even other manufacturers can easily be used. And there are always the excellent Sony FE primes as another, albeit slightly less versatile, option for the Sony A7/R photographer.

Product Photos
Note: We only had this lens in-house for a brief time, and as such, we were only able to fully test it on the Sony A7R. However, we have included sub-frame sample images from the Sony NEX-7.
Check out some sample photos shot by our senior lens technician Rob Murray. You can view more sample photos, plus download the full-resolution files, over at our Flickr page.
Sony FE 24-70mm f/4 ZA OSS Zeiss Vario-Tessar T* SEL2470Z User Reviews
lightknight
Pros: Small, quite light, very well built and very sharp
Cons: F4...
I bought this lens despite the reviews. All I can say is that on my A7R sharpness is super and I see very little difference between F4 and F8 in the corners.
reviewed January 15th, 2017
photographystudio101
Pros: smallish, smooth zoom ring, quality materials, feels solid
Cons: huge distortion, some chromatic aberration, pricey lens
I purchased this lens last year after hearing some negative reviews and some positive reviews. Needed one and had no choice but to get this lens. I got it after Nov. 2015 and I took it overseas to New Zealand, Australia, Japan, Korea and China and the US. This lens performed very well for me and it was very sharp and I have many great shots over a span of 3-4 months. It's a keeper for me. I don't know why this lens did not perform very well for some reviewers except that it's possible they received a bad copy. The most negative review of this lens was the distortion which I agree on. The distortion is very disturbing but can be corrected in Lightroom and poses no problems for me. In fact, I enjoy post processing my images. The feel of this lens is excellent. It's well made no doubt, and just feels great to use in the hands. The zoom ring is silky smooth and the OSS works fantastic. It does weigh a little but nothing compared to other 24-70 f4's. The color rendition is superb and the dynamic range with the A7rII is just unbelievable. The only downside is its price; at around $1100.00, it's one of the most costly 24-70 f4's out there. I felt a 24-105 f4 would have been a better lens for travel and roundabouts, but there's none available yet except with the use of adapters which haven't proved to be too successful with other mounts yet. This lens is a keeper for me as long as I have the A7rII in my hands.
reviewed April 25th, 2016
glenwells
Pros: 24mm wide angle, excellent 50mm
Cons: soft in corners at 24mm f4 to f5.6
Almost bought this lens twice before but put off by mixed reviews, especially early ones when the lens was £1200!!
Just bought one as of Jan 2016. Prices have come right down - new £539 after shopping around. Better value/performance balance at that price.
I have the 28-70mm kit lens that came as a kit which was £775 all in so not too fussed about upping to the Zeiss.
The Zeiss 24-70mm lens then - build quality is very good it is a little bit larger than the 28-70mm but I guess you got to squeeze in the extra 4mm at the wide end and accommodate the F4 at the tele end. I believe that my 'copy' of the lens is as good as can be expected with no issues that could be blamed on poor QC.
I have to disagree with the poor sharpness test results here, mine is more as the one scored on DXO when viewing the 'field map' view but with a much better 70mm end - was their copy duff?.
Ditortion is there in RAW which brings its own issues but I have Adobe RAW set to auto correct for me a not a problem. Others will debate loss of detail due to the corrections needed on jpg but many lenses do that in camera now.
The lens is great from wide open at 35mm and 50mm impressive IMO.
At the 24mm end the corners are soft until f5.6 where they are fine and they keep improving up to about F11ish. The 24mm corner softness has been raised loudly in the reviews that I have read, I can appreciate that especially when the lens was selling at £1200. For me the soft corner at 24mm F4 is less of an issue as when I am using that combination I am usually focusing on a subject and its surrounding will be blurred off by brokeh anyhow. I guess there are pelnty of brick walls that will look terrible at 24mm F4 in the corners (joke).
I have had many camera systems and different lenses over the years and even some 'quality' primes I have owned have been soft in the corners at the widest aperture, needing to be stopped down a bit to make them shine.
The lens is better than the kit lens at all focal lengths and at all apertures. I am glad that I shrugged off the negatives and bought it. It has not been off the camera since I got it.
(purchased for $539)
7 out of 10 points and not recommended by
Pros: size, build quality
It's a small and light lens and its build quality is good. But that's about it talking about the pros.
Optically this lens is a disaster. Corners never get really sharp (no matter which focal length or aperture you use), it suffers from chromatic aberration and distortion as well as vignetting. If you don't need the 24mm wide angle I'd say go for the 28-70.
reviewed December 7th, 2015
(purchased for $1,000)
5 out of 10 points and not recommended by
Pros: Well built, nice range and f4 keeps the weight down
Cons: Optical quality is sub-par, especially for the price.
There's a lot to like about this lens (build, range, weight), but unfortunately optical quality is not one of them. If you like centre sharpness, then it's reasonable, but the corners are just plain soft at all focal lengths. I cannot recommend a lens that performs so poorly. Chromatic aberration is bad in the corners as well; not that it's difficult to fix, but for what I consider a premium lens, it's poor performance. Put this on an A7R or A7R II, and it's embarrassing. Unfortunately, I bought it based on the opinion of someone I thought knew what they were doing. Even though the range is nice, and Sony's lens lineup is sparse, I would avoid this lens. It's just not very good.
reviewed September 22nd, 2015
andre_
Pros: Weather resistant, good optical performance, very good OSS
Cons: Distortion correction is mandatory, no distance scale on the barrel
Great lens for street and travel, the perfect companion for the A7. Lightweight and weather resistant (despite lacking an internal zoom), this lens has very good optical qualities, with well corrected aberrations and only some softness in the corners at f4 (but I don't care...). The bokeh (the out-of-focus rendering) is close to some primes', and the general rendition is almost the same at landscape distances and closer. The autofocus is quite quick and very precise (on the A7), and the vibration reduction system is efficient and silent. All of this without much increase in battery drain compared with using manual lenses. I usually use this 24-70 without the hood, and I don't remember a single case of flare.
As in every review, the corners are soft at f4, but it isn't an issue for street photography or for landscapes, since at f5.6-8 the rendition is even across the frame. Maybe the definition is a little lower at 70mm, but the 24MP of the A7 sensor are well resolved (perhaps it could suffer a little on the A7r).
The only real issue I've experienced is the distortion, present along the whole zoom range. As I wrote, distortion correction is mandatory (built-in for JPG), but the Lightroom profile works well on the RAW files.
Highly recommended!
rponiarski
Solid construction, great contrast and colors
Everyone panned this lens and I did not know what to think, but after seeing the quality of the Sony/Zeiss 55mm, I took a chance and bought this lens as my walk around lens for my A7m2.
All in all, it is a great lens. Yes, $1200 is a lot for an f4 zoom lens, but it has typical Sony/Zeiss quality. The micro-contrast and colors are excellent, and it is sharp from f4 in the center with just a little softness on the edges, which is really only of interest to pixel peepers. For street photos and general landscapes, this is an excellent lens, well made, and it feels like a quality piece, which it is. Flare is handled very well and the OSS works really well. Highly recommended!
(purchased for $1,200)
Mars Exploration Rover Mission Status
A problem that affects the steering on NASA's Mars Exploration Rover Spirit has recurred after disappearing for nearly two weeks. Engineers at NASA's Jet Propulsion Laboratory, Pasadena, Calif., are working to fully understand the intermittent problem and then implement operational work-arounds. Meanwhile, Spirit successfully steered and drove 3.67 meters (12 feet) on Oct. 17. Rover engineers are also analyzing a positive development on Spirit's twin, Opportunity: a sustained boost in power generation by Opportunity's solar panels. Both rovers have successfully completed their three-month primary missions and their first mission extensions. They began second extensions of their missions on Oct. 1.
Rover engineers refrained from driving Spirit for five days after an Oct. 1 malfunction of a system that prevents wheels from being jostled in unwanted directions while driving. Each of the front and rear wheels of the rover has a motor called a steering actuator. It sets the direction in which the wheel is headed. The steering actuators are different from the motors that make the wheels roll, and hold the wheel in a specific direction while driving. A relay used in turning these steering actuators on and off is the likely cause of the intermittent nature of the anomaly. The relay operates Spirit's right-front and left-rear wheels concurrently, and did not operate as commanded on Oct. 1.
Subsequent testing showed no trace of the problem, and on Oct. 7, the rover steered successfully and drove about 2 meters (7 feet), putting it in position to examine a layered rock called "Tetl" for several days. However, the anomaly occurred again on Oct. 13, and the problem appeared intermittently in tests later last week.
"We are continuing tests on Spirit and in our testbed here at JPL," said Jim Erickson, Mars Exploration Rover project manager at JPL. One possible work-around would be to deliberately blow a fuse controlling the relay, disabling the brake action of the steering actuators. The rovers could be operated without that feature. "The only change might be driving in shorter steps when the rover is in rugged terrain," Erickson said.
Spirit has driven a total of 3,647 meters (2.27 miles) since landing, more than six times the distance set as a goal for the mission. Its current target is a layered rock called "Uchben" in the "Columbia Hills." Opportunity has driven 1,619 meters (just over a mile). Its latest stop is a lumpy boulder dubbed "Wopmay" inside "Endurance Crater."
The daily power supply for each rover comes from 1.3 square meters (14 square feet) of solar panels converting sunlight into electricity. Just after the landings in January, the output was about 900 watt-hours per day for each rover -- enough to run a 100-watt bulb for nine hours. As anticipated, output gradually declined due to dust buildup and the martian seasonal change with fewer hours of sunlight and a lower angle of the Sun in the sky. By July, Spirit's daily output had declined to about 400 watt-hours per day. It has been between 400 and 500 watt-hours per day for most of the past two months. Opportunity, closer to Mars' equator and with the advantage of a sunward-facing tilt as it explored inside the southern half of a crater, maintained an output level between 500 and 600 watt-hours per day in June, July and August. Since early September, the amount of electricity from Opportunity's solar panels has increased markedly and unexpectedly, to more than 700 watt-hours per day, a level not seen since the first 10 weeks of the mission.
"We've been surprised but pleased to see this increase," said Erickson, "The team is evaluating ways to determine which of a few different theories is the best explanation." Possible explanations under consideration include the action of wind removing some dust from the solar panels or the action of frost causing dust to clump. "We seem to have had several substantial cleanings of the solar panels," Erickson said.
JPL, a division of the California Institute of Technology in Pasadena, manages the Mars Exploration Rover project for NASA's Science Mission Directorate, Washington. Additional information about the project is available from JPL at http://marsrovers.jpl.nasa.gov/ and from Cornell University, Ithaca, N.Y., at http://athena.cornell.edu.
News Media Contact
Guy Webster (818) 354-6278
Jet Propulsion Laboratory, Pasadena, Calif. Don Savage (202) 358-1727
NASA Headquarters, Washington
2004-261
updated 03:25 pm EST, Mon February 20, 2012
Could lose spectrum most needed for GPS workaround
LightSquared has failed to make a $56.25 million payment to UK satellite operator Inmarsat. The payment was to be the first of several payments, totaling $175 million for the year, to be paid by LightSquared to Inmarsat under a deal struck between the two in 2007 for access to spectrum. If LightSquared doesn't pay, Inmarsat can legally end the deal.
Last week, LightSquared's efforts to build out its LTE network were put on a possibly permanent hold after the FCC decided that LightSquared's LTE transmissions interfered with GPS signals used by consumers, the military and air traffic. Inmarsat's bandwidth is especially critical to LightSquared because it runs over blocks of the spectrum least likely to cause interference with GPS.
LightSquared has denied that it's responsible for creating interference and has gone so far as to accuse government agencies of allowing vested interests to rig findings to protect the GPS industry. There has been talk of the company possibly suing the FCC and other agencies, although that likely wouldn't come in time to avoid a mid-March deadline for getting help from Sprint. [via PCWorld]
Tuesday, August 12th 2014 at 12:20AM BST
Signal Studios' Toy Soldiers series has been picked up by publisher Ubisoft, with the two companies announcing Toy Soldiers: War Chest for release in 2015.
The new partnership will allow the franchise to make its PlayStation debut, as Signal Studios' previous two Toy Soldiers titles had been published by Microsoft and tied to Xbox 360 and PC platforms.
"[In War Chest] gamers of all ages can play as and face off against toys familiar to them from their childhoods to recreate the battles from their imaginations," Signal’s founder D.R. Albright said. "Fans will have a great time learning how each army plays and the individual strengths of their troops and weapons."
Albright added that the developer’s new deal with Ubisoft allows the studio to create the game it has “always envisioned.”
Toy Soldiers: War Chest will launch at some point next year on PC, PS4 and Xbox One.
By: The Editorial Team
Interview with Prof. Salim Al-Hassani at 1001 Inventions Exhibition in National Geographic Museum by Turkish-American TV
1001 Inventions Exhibition at the National Geographic Museum from Turkish-American TV on Vimeo.
The award-winning "1001 Inventions" exhibit held at the National Geographic Museum reveals the ancient Muslim world's ground-breaking contributions to science and technology and how its influence extends into the present day. "1001 Inventions" has gathered much attention from a wide audience in several cities, including Istanbul, Turkey, where it was displayed at Prime Minister Tayyip Erdogan's request. Over the course of 7 weeks, the exhibit received 450,000 visitors in Sultan Ahmet Square. Professor Salim Al-Hassani, President of the Foundation for Science, Technology and Civilisation and the Chief Editor of the book "1001 Inventions", remarked that he saw many young Turks cry with joy upon seeing their ancestors' brilliant contributions to history, stating: "This is fantastic… we now feel that we have respect; we have appreciation from the rest of the world."
Professor Al-Hassani emphasizes that the idea for compiling the knowledge behind this exhibit was born 13 years ago after a professor and colleague of his stated that 1,000 years of history, otherwise known as the dark ages in the West, had been forgotten. Professor Al-Hassani prefers to refer to this period as the "Golden Age", as many pioneers from the Muslim world significantly changed the course of history for the better during this time. He underscores how this exhibit has helped many people worldwide to recognize their integral role in civilization, as their ancestors were its "builders."
The exhibit is an interactive and entertaining manifestation of knowledge compiled by the Foundation for Science, Technology and Civilisation and 1001 Inventions Ltd. It counters the widely yet mistakenly held notion that the exotic backdrop of "One Thousand and One Nights," a series of Arabic folk tales, is an actual depiction of ancient Islamic civilization by playing with the story's title and providing a rich array of information based on ancient manuscripts and other findings. Turkish-American TV is a proud supporter of 1001 Inventions and provides a glimpse into the interactive and educational nature of this exhibit while including informative interviews with Professor Al-Hassani and Richard McWalters, Director of National Geographic's Museum Operations.
For more information please visit www.1001inventions.com/dc.
(Nanowerk News) Working with atomic-scale particles known as quantum dots, a Missouri University of Science and Technology biologist hopes to develop a new and better way to deliver and monitor proteins, medicine, DNA and other molecules at the cellular level.
The approach would work much like a virus, but would deliver healing instead of sickness, says Dr. Yue-Wern Huang, associate professor of biological sciences at Missouri S&T. Huang is leading the research effort, which is funded through a $225,000 grant from the National Institutes of Health under the American Recovery and Reinvestment Act.
Huang's research involves constructing tiny vessels of cell-penetrating proteins to transport the quantum dots, along with proteins, medicine or DNA, into the cell and release them. He likens the process to the ancient story of the Trojan Horse, which according to Greek mythology was used to deliver Odysseus and his army into the enemy city of Troy. But in this instance, the vessel is a "protein transduction domain," the cargo consists of biomolecules or other therapeutic agents, and the walled city is the cell.
Essentially, the nontoxic protein transduction domain, or PTD, is derived from a virus that can penetrate the cellular membrane. But instead of spreading sickness, it would spread medicine or DNA.
Quantum dots are fluorescent semiconductor nanocrystals - specks that are only a few nanometers in size - that possess unusual physical and chemical properties, making them attractive as tools for new approaches to medicine. For example, Huang says, the fluorescence of quantum dots does not fade as quickly as that of traditional fluorescent dyes used for tracing or mapping in the body. Moreover, quantum dots have a longer half-life and are more resistant to degradation than traditional fluorescent dyes. Because of these qualities, quantum dots are more effective for detecting cancerous cells and other maladies, Huang says.
"Quantum dots are very photo-stable and they have a very high quantum yield. In other words, you don't need to use very much and it is very easy to detect under the microscope," he says.
Huang and his fellow researchers plan to synthesize cadmium-based fluorescent quantum dots, encapsulated by other elements to render the cadmium harmless, and attach them to protein transduction domain (PTD) materials. The quantum dot/PTD mixture is then combined with the cargo, placed into cell cultures and examined. Though early in the research, Huang says the material populates the cell cultures 10 times faster than a system without PTDs over an hour's time.
According to Huang, this work is unique because it involves the merger of two separate areas of biomedical study - quantum dot research and the PTD delivery system. Before this research, the two disciplines have never been merged, he says.
Huang projects "many potential long-term applications in biomedical areas" to come from this research. They include improvements in medical imaging and monitoring, as well as more efficient delivery of medicines and therapeutic agents at the cellular level and in humans.
Other Missouri S&T researchers working with Huang on the effort are Dr. Jeffrey Winiarz, an assistant professor of chemistry, who is creating the quantum dots, and Dr. Katie Shannon, assistant professor of biological sciences, who is providing bio-imaging expertise.
Source: Missouri University of Science and Technology
Jack Wolf, Who Did the Math Behind Computers, Dies at 76
Jack Keil Wolf, an engineer and computer theorist whose mathematical reasoning about how best to transmit and store information helped shape the digital innards of computers and other devices that power modern society, died on May 12 at his home in the La Jolla section of San Diego. He was 76.The cause was amyloidosis, a disorder caused by the buildup of a complex protein in body tissue or organs, his daughter Sarah Wolf said.Dr. Wolf was lionized by information theorists and won many awards recognizing his contributions to the communications networks that now lace the earth. Devices like cellphones would not exist had not thinkers like Dr. Wolf come up with their mathematical underpinnings.“He made fundamental contributions in really all the various subdivisions of information theory,” said Paul Siegel, a professor at the University of California, San Diego, where Dr. Wolf also taught.The Institute of Electrical and Electronics Engineers called Dr. Wolf “one of the most productive cross-fertilizers in engineering research, successfully importing techniques used in one field to obtain unexpected results in another.”
Information theorists built the intellectual foundation of the computer age by using advanced algebra to devise ways to send, receive and store data. An essential idea, from the 1940s, was the binary language of computing using only 0’s and 1’s, later refined, elaborated and extended by scientists like Dr. Wolf.Information theorists’ most urgent mission is to condense information into the most efficient shorthand, or codes. This involves finding algorithms to turn data into electrical impulses, then to transmit those impulses and finally to decode them.“It’s a theory of ‘how little do we need to transmit to get information across?’ ” said Andrew Viterbi, creator of the pervasive Viterbi algorithm, which clears clutter from electronic messages.
[Photo: Jack Keil Wolf. Credit: University of California at San Diego]
Dr. Viterbi said a method Dr. Wolf developed for compressing separate streams of data into a single message had uses in flash memory devices.
Dr. Wolf later made advances in data storage, removing errors and clarifying fuzzy information retrieved from magnetic disks so that more data could be stored in less space. “This is at the heart of the information revolution,” said Lawrence Larson, a colleague of Dr. Wolf’s at the University of California.
Dr. Wolf was born in Newark on March 14, 1935. He delighted in pointing out that Einstein was born on the same day and that the month and day, expressed in numbers, was the beginning of Pi, 3.14, the ratio of a circle’s circumference to its diameter.
He earned his bachelor’s degree from the University of Pennsylvania, and two master’s degrees and a Ph.D. from Princeton. He served in the Air Force and taught at New York University, the Polytechnic Institute of Brooklyn and the University of Massachusetts at Amherst before moving to San Diego.
His most important early work was devising a theorem with David Slepian in 1973 proving that two separate streams of correlated data can be sent independently and simultaneously and then combined and simplified at journey’s end. An example would be neighboring temperature sensors independently sending data to a weather center. Decades later, building on the work of other theorists and engineers, the technique propelled the development of computer networks.
“It sometimes takes decades for the implementation of a technology to catch up with the concept,” Dr. Viterbi said.
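For readers curious about the mathematics behind that description, the 1973 Slepian-Wolf theorem is usually stated as a set of conditions on the encoding rates; the notation below is standard textbook shorthand, added here for illustration rather than quoted from Dr. Wolf's paper or this article.

\[
R_X \ge H(X \mid Y), \qquad R_Y \ge H(Y \mid X), \qquad R_X + R_Y \ge H(X, Y)
\]

Here R_X and R_Y are the rates, in bits per symbol, at which two correlated sources X and Y are separately encoded, and H denotes Shannon entropy. The striking consequence is that the combined rate H(X, Y) is no higher than what would be needed if the two encoders could see each other's data.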
Dr. Wolf was recruited by the University of California, San Diego for its new Center for Magnetic Recording Research in the mid-1980s. He said in a recent lecture that he knew nothing about magnetic recording at the time, and even mispronounced the word “coercivity,” which refers to a magnetic field’s intensity.
Scientists then thought magnetic recording was boring, he said, adding, “Very smart people are sometimes wrong.”
The center’s research helped increase the speed and capacity of magnetic hard drives while lowering their cost. Dozens of Dr. Wolf’s protégés, nicknamed “the Wolf pack,” fanned out to high-level, high-tech positions. Dr. Wolf was elected to the national academies of both engineering and science.
In addition to his daughter Sarah, Dr. Wolf is survived by his wife, the former Toby Katz; another daughter, Jill Wolf; his sons Joseph and Jay; and five grandchildren.
A version of this article appears in print on May 21, 2011, on Page A20 of the New York edition with the headline: Jack K. Wolf, 76, Who Did The Math Behind Computers.
We must have the right to consent (or not) to the collection, usage and distribution of biometric information.
By Dorothy HukillSpecial to the Star-Banner
Six-year-old Susie is excited about her first day of school. She lets go of her mom's hand, looks back and waves at her as she climbs the steps of the big yellow school bus. When she reaches the top step, she presses her face against a machine that looks like binoculars — an iris scanner — which confirms that she has boarded the bus, and then she takes a seat next to her best friend. Fast-forward 12 years, and little Susie is all grown up and ready to buy her first car — but there is a problem. The car salesman explains to Susie that there is an issue with her credit, and they won't be able to finance the car she worked for throughout high school. As it turns out, Susie's identity was stolen by a hacker years before she was even old enough to know what credit was. Using her biometric information collected by her school, the hacker obtained loans and credit cards all during her school years.Is this a far-fetched scenario? Not really.Biometric information is any physical or behavioral information that is specific to a single person, such as a fingerprint, palm print, DNA, facial recognition, retina or iris, and voice print. These characteristics can be scanned, stored and used to confirm a person's identity — and who knows what else.While Susie can change her PIN number for her credit card to ward off unauthorized usage, the iris does not change over a person's lifetime.What type of information is being collected? Who has access to this information? Where is the information being stored, and is it secure? Can this information be gathered into larger data pools for a national database? Can corporations buy this information for their own commercial use? What happens when a student's biometric information is stolen? Are there health concerns? Have parents given their permission?Should we be concerned? The simple answer is YES. Some schools in Florida (thankfully not in my district of Volusia, Marion and Lake counties) have started collecting biometric information for various school applications — to speed up lunch lines, check out a library book, record attendance, track students' movements on campus and for boarding the school bus. Will testing, evaluation and health care information be added to the mix? The Polk County school district has already collected iris scans of approximately 750 schoolchildren without parental consent, so they could board the school bus. The Pinellas County school district has collected 50,000 palm scans so school children could pass through the lunch line.Is this only a Florida problem? No. Texas, California and New York have tried “smart” ID cards, which use radio frequency identification (RFID) chips to track a student's location on campus at all times. Maryland used palm scanners to access students accounts in cafeteria lines.Is this only a U.S. problem? No, countries throughout the world are busy collecting biometric information — Australia, Canada, Mexico, the Netherlands, the United Kingdom and on and on. India currently has the largest biometric information program in the world with more than 200 million enrolled, which is expected to increase to 1.25 billion people.In light of today's expanding and changing uses of technology and the real threat of identity theft, I am concerned we are moving forward without questioning the consequences to our privacy, civil rights and our economic freedom. 
The Florida Constitution, Article I, Section 23, expressly provides that “every natural person has the right to be let alone and free from governmental intrusion into the person's private life” and is justification for preventing intrusion by government into our lives and for preventing disclosure of personal information. However, if this information is taken without permission, stolen or compromised, how do you change or reset it? If we are going to allow the collection of this data, we need to have specific protocols, procedures and safeguards in place. We must be aware that this information is being collected and have the right to consent (or not) to its collection, usage and distribution. I plan to file legislation in the upcoming 2014 legislative session to protect and secure our students' and parents' rights. Our children depend on us to do everything possible to ensure that they are safe. As parents, educators, legislators and citizens, we must meet that challenge. Surely we can move a lunch line without this information. Dorothy Hukill is the state senator for District 8, representing parts of Marion, Lake and Volusia counties. She previously served in the Florida House of Representatives and as mayor of Port Orange. | 科技 |
By Deborah M. Todd Pittsburgh Post-Gazette
All those digital "likes" on Facebook could be the most unfriendly thing your virtual friends could do.
A steady diet of Facebook kudos leads to a loss of discipline, weight gain and credit card debt, according to a study conducted by professors from the University of Pittsburgh and Columbia University.
The study, "Are Close Friends the Enemy? Online Social Networks, Self-Esteem and Self Control," reveals that users who interact with close friends on Facebook experience a temporary increase in self-esteem. However, that boost in confidence is followed by a discernible lack of self-control.
Users whose social media engagement was above average were more likely to have "a higher body-mass index, increased binge eating, a lower credit score and higher levels of credit card debt," according to a report on the research by Andrew T. Stephen, assistant professor of business administration in Pitt's Katz School of Business, and Keith Wilcox, assistant professor of marketing at Columbia Business School.
So how exactly did praise become the enemy of discipline?
Mr. Stephen said social media users experience a so-called "licensing effect" in which positive reinforcement about their lives gives them license to treat themselves.
"It's like a present. It's saying I'm going to give myself a gift. I deserve to get that unhealthy snack instead of sticking to my diet for the day," he said.
The findings surprised the researchers, who went into the study simply hoping to better understand the psychology behind social media interactions.
"Social media is used by millions of people on a daily basis and some of them spend a lot of time on it. We knew it had to have some kind of effect and there was no one study designed to ask those questions," said Mr. Stephen.
The study -- which was published online in November in the Journal of Consumer Research and is scheduled to run in the publication's June 2013 print edition -- included five different experiments conducted with more than 1,000 Facebook users in the United States.
The experiments found that people showed elevated levels of self-esteem if they had previously indicated they had strong ties to Facebook friends. People with neutral or weak ties to Facebook friends showed no changes in self-esteem.
Positive feedback from close Facebook friends is only part of the confidence boost, said Mr. Stephen. A large part also is related to the "presentation bias" that comes with using the site to present one's most positive qualities to the public.
And for some, just about anything posted by them could be considered positive, according to readers who responded to a Post-Gazette post about attention-seeking Facebook friends.
"I wrote 'I might be a Facebook addict" on my status and got 37 likes in an hour. What does that say about me, not to mention my friends? I dunno ... anyway I'm going to the bathroom now," wrote respondent Mike Weis.
Charlene Rapp admitted she turned to Facebook to brag about receiving Pirates tickets as soon as she got them. "I just posted that I got my Pirates spring training tickets. Had to let all my friends in PA know what they're missing!" she wrote.
For respondent Randy Mogle, social media has become a critical factor in maintaining a strong marriage. "I told my wife, 'How can we communicate properly if you don't read my Facebook posts!'" he wrote.
Facebook users don't get a boost in self-esteem from paying attention to information being shared with them, but they do when they focus on their own postings.
According to Mr. Wilcox, it's all the better if a post about a job promotion or a child's straight-A report card is backed by "likes."
In one experiment, participants browsed either Facebook or the CNN news site and then were given the option of eating either a granola bar or a chocolate chip cookie. More often than not, Facebook users made the less healthy choice, going for the cookie.
Mr. Stephen and Mr. Wilcox said they are working to examine how their research can be used for advertising and other commercial purposes, but they also want to also get the word out to social media users to they can put their behaviors in check.
"People aren't necessarily aware it's happening, but it's not unconscious [behavior]. If more people are aware of the trend, then more can control their behaviors," Mr. Wilcox said.
"It's OK to feel good about yourself, that's fine. But just because you feel good about yourself that doesn't mean you have to go out and splurge."
Deborah M. Todd: dtodd@post-gazette.com or 412-263-1652.
Obama calls celebrating their contributions "one of the most important ways to restore science to its rightful place"
On November 17th, President Obama presented 10 researchers with the highest technical and scientific award given by the United States, the the National Medal of Science.
"The achievements of these men and women stand as testament to their ingenuity, to their zeal for discovery and for their willingness to give of themselves and to sacrifice in order to expand the reach of human understanding," the President said at a ceremony at the White House on Wednesday evening. "The scientists and innovators here have saved lives, improved our health and well being, helped unleash whole new industries and millions of jobs, transformed the way we work, learn and communicate ... their contributions serve as proof not only to their creativity and skill, but to the promise of science itself." The National Medal of Science was created by statute in 1959, and is administered for the White House by the National Science Foundation (NSF). Awarded annually, the medal recognizes individuals who have made outstanding contributions to science and engineering. Nominees are selected by a committee of presidential appointees based on their advanced knowledge in, and contributions to, the biological, behavioral/social and physical sciences, as well as chemistry, engineering, computing and mathematics.
NSF Director Subra Suresh addressed the laureates at a black tie awards dinner that followed the formal ceremony at the White House. "Tonight we acknowledge not only the heritage that you received and have carried on, but even more so, we celebrate the legacy that you bestow to coming generations of explorers, discoverers, inventors, and innovators," said Suresh. "You feed our anticipation of continued contributions and leadership. And, we look forward to the future flowering of science and innovation in your hands and in the hearts and minds of those you mentor and train ... To put the minds and the hands of humanity to work in creating a better world for all."
On its 51st anniversary, this year's National Medal of Science recipients are:
Yakir Aharonov, Chapman University, for his work in quantum physics, which ranges from the Aharonov-Bohm effect, to the notion of weak measurement, making him one of the most influential figures in modern physics.
Stephen Benkovic, Pennsylvania State University, for his seminal research that has changed our understanding of how enzymes function, singly or in complexes, and has led to novel pharmaceuticals and biocatalysts.
Esther Conwell, University of Rochester, for promoting women in science, and for contributions to understanding electron and hole transport in semiconducting materials that have helped to enable integrated circuits and organic electronic devices.
Marye Anne Fox, University of California, San Diego, for seminal contributions to chemistry by elucidating the role that non-homogeneous environments can exert on excited-state processes, and enhancing our understanding of charge-transfer reactions and their application to such fields as polymers, solar energy conversion and nanotechnology.
Susan Lindquist, Massachusetts Institute of Technology, for showing that changes in protein folding can have profound and unexpected influences in fields as wide-ranging as human disease, evolution and nanotechnology, and for providing fundamental experimental support for the prion hypothesis. The prion hypothesis is a key scientific assertion associated with a group of progressive conditions that affect the brain and nervous system of many animals, including humans.
Mortimer Mishkin, National Institutes of Health, for fundamental contributions to understanding the functional organization of the primate brain, including the discovery of the role of the inferior temporal cortex in vision, delineation of the selective contributions of medial temporal lobe structures to memory, and discovery of the neural bases of cognitive and noncognitive memory systems.
David Mumford, Brown University, for extraordinary contributions to the mathematical, engineering and neurobiological sciences.
Stanley Prusiner, University of California, San Francisco, for his discovery of prions representing an unprecedented class of infectious agents comprised only of proteins, which elucidated a novel paradigm of disease in which a single pathogenic process produces infectious, inherited or sporadic illnesses in humans and animals.
Warren Washington, National Center for Atmospheric Research, for his fundamental contributions to the understanding of Earth's coupled climate system through numerical simulation, leadership in U.S. science policy, and inspiring mentorship of young people of all backgrounds and origins.
Amnon Yariv, California Institute of Technology, for scientific and engineering contributions to photonics and quantum electronics that have profoundly impacted lightwave communications and the field of optics as a whole.
Image Caption: NSF Deputy Director and Director bookend National Medal of Science laureates at the celebratory black tie dinner that followed the White House ceremony. From left to right, NSF Deputy Director Cora Marrett, Amnon Yariv, Warren Washington (former Chairman of the National Science Board), Stanley Prusiner, David Mumford, Mortimer Mishkin, Susan Lindquist, Marye Anne Fox, Esther Conwell, Stephen Benkovic, Yakir Aharonov and NSF Director Subra Suresh. Credit: Sandy Shaeffer for the National Science Foundation
Stephanie O'Neill
A private foundation has awarded the University of California at Riverside $5 million to study age-old questions surrounding immortality and life-after-death. UCR
“I’m kind of a skeptic about an afterlife,” Fischer said in a Skype interview from Germany, where he’s a research fellow at the Center for Advanced Study in Bioethics, University of Muenster. “I’m inclined not to believe there is one, but I certainly don’t know.” Fischer said the research at UCR will last three years and will look into a wide range of immortality issues. Among them: the cultural differences that shape near-death experience, such as why Americans who have a near-death experience usually report a tunnel with a light at the end, while in Japan most who experience the phenomenon report tending to a garden.
Other research will delve into such issues as whether technological and medical advancements could create immortality or at the very least much longer life spans for humans. And if so, how would immortality affect the meaning and value we place on our lives? Or do we need death to give life meaning? “We can chip away at the problem by figuring out what features make life more meaningful and attractive and what features take that away,” he said of the project that will solicit research topics from scientists, philosopher, theologians and others worldwide, beginning Sept. 1, 2012 and will announce grants next year. Fischer said he’s allotting $2.5 million to fund up to 10 scientific research projects into various questions of immortality. Another $1.5 million will go to 15 philosophers and theologians to support them in writing articles and books. He said the research topics will also include such questions as: -- Whether and in what form a person could survive bodily death. -- Whether the information in our brains could be uploaded into a computer to allow one to exist there in perpetuity. -- How a person’s beliefs about immortality influence their behavior, attitudes, and character. The remaining $1,000,000, Fischer said, will fund post-doctoral and graduate students; the Immortality Project website and two conferences on immortality at UC Riverside. “One of the main goals I have is just to try and understand more about what we value and what we care about in our own finite lives,” Fischer said. “By studying pictures and conceptualizations of immortality -- in religion and in literature and science fiction -- we can come to figure out something about the meaning of our own finite lives.” The best SoCal news in your inbox, daily.
Infections linked to heart-surgery bug found in LA County
The Styled Side: Looking great with a medical condition
KPCC's Health coverage is a Southern California resource provided by member-supported public radio. We can't do it without you. | 科技 |
search for life
skywatching
Space.comScience & Astronomy
Phew! Universe's Constant Has Stayed Constant
By Clara Moskowitz, SPACE.com Assistant Managing Editor |
The Effelsberg radio telescope by night.
Credit: Paul Jansen
By peering at alcohol molecules in a distant galaxy, astronomers have determined that a fundamental constant of nature has hardly changed at all over the age of the universe.
The constant — the ratio of the mass of a proton to the mass of an electron — has changed by only one hundred thousandth of a percent or less over the past 7 billion years, the observations show.
The scientists determined this by pointing the Effelsberg 100-m radio telescope at a distant galaxy that lies 7 billion light-years away, meaning its light has taken that long to reach Earth. Thus, astronomers are seeing the galaxy as it existed 7 billion years ago. The telescope looked for special light features that reflect the absorption of methanol, a simple form of alcohol that contains carbon, hydrogen and oxygen.
If the ratio of the mass of the protons and electrons inside those atoms were different than it is here and now in our own galaxy, the scientists would be able to detect this in the properties of the light.
"This idea makes the methanol molecule an ideal probe to detect a possible temporal variation in the proton-electron mass ratio," astrophysicist Wim Ubachs of VU University Amsterdam said in a statement. "We proposed to search for methanol molecules in the far-distant universe, to compare the structure of those molecules with that observed in the present epoch in laboratory experiments."
Their observations confirmed that the proton-electron mass ratio has changed by no more than 10^-7 over the past 7 billion years. The universe itself is 13.7 billion years old. [The Universe: Big Bang to Now in 10 Easy Steps ]
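As a back-of-the-envelope way to connect the two figures quoted above (a fractional change of 10^-7 and "one hundred thousandth of a percent"), the arithmetic runs roughly as follows; the present-day laboratory value of the ratio and the per-year drift figure are added here purely for illustration and are not taken from the Science paper itself.

\[
\mu \equiv \frac{m_p}{m_e} \approx 1836.15, \qquad \left|\frac{\Delta\mu}{\mu}\right| \le 10^{-7} = 10^{-5}\,\% \quad \text{over roughly } 7\times 10^{9} \text{ years},
\]
\[
\text{which, if spread evenly, corresponds to a drift of at most } \frac{10^{-7}}{7\times 10^{9}\ \text{yr}} \approx 1.4\times 10^{-17} \text{ per year}.
\]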
The ratio of the mass of the proton to the mass of the electron is a type of fundamental constant, or a number that can't be deduced from theory, but must be measured in nature. If the value of this constant were very different than it is, then atoms might never have formed, and our universe would lack stars and galaxies and people.
And if the value of this constant had changed over time, it could have far-reaching consequences for the evolution of the universe.
"If you see any variations in that fundamental constant, then you would know that something is wrong in our understanding of the foundations of physics," said Karl Menten, director of the Max-Planck-Institut für Radioastronomie in Germany and head of the Institute's Millimeter and Submillimeter Astronomy Department. "In particular, it would imply a violation of Einstein's Principle of Equivalence, which is at the heart of his General Theory of Relativity."
The findings are detailed in a paper published in the Dec. 14 issue of the journal Science.
Follow Clara Moskowitz on Twitter @ClaraMoskowitz or SPACE.com @Spacedotcom. We're also on Facebook & Google+.
Author Bio Clara Moskowitz, SPACE.com Assistant Managing Editor
Clara has been SPACE.com's Assistant Managing Editor since 2011, and has been writing for SPACE.com and LiveScience since 2008. Clara has a bachelor's degree in astronomy and physics from Wesleyan University, and a graduate certificate in science writing from the University of California, Santa Cruz. To find out what her latest project is, you can follow Clara on Google+.
By Jared Newman, PCWorld
It Doesn't Always Pay to Line Up Early
Whether or not you believe that early adopters of new tech always get screwed, history proves that buying the latest gadget on day one isn't always a great idea. Sometimes, expensive products quickly become dirt-cheap, hype transforms into obsolescence, and early support is rewarded with a stab in the back. Consider these examples of the 15 biggest early-adopter fails (and the year they let us down) and tell us if a tech company has ever treated you worse.
Sega 32X (1995)
Flying high off the success of the Genesis/Mega Drive, Sega became a little too eager to jump into the 32-bit era. The 32X, released in late 1994, was a peripheral that plugged into the top of the Genesis, and was supposed to be like an entirely new console. But with the launch of the Sega Saturn looming, most consumers cheerily ignored the add-on. Those who didn't were stuck with a product that Sega discontinued less than a year later.
Apple iMac G3 (1998)
When Apple released the first iMac all-in-one computer in August 1998, the machine was stylish, colorful, and hopelessly outdated almost immediately after purchase. Within the iMac's first eight months on the market, Apple upgraded the specs three times, so if you waited until April 1999, you had a 333MHz processor instead of a 233MHz CPU, an extra 2GB of storage, an additional 4MB of graphics memory and--most important for Apple fashionistas--a choice of five more colors.
Windows Me (2000)
By now every techie knows that Microsoft's Windows Me operating system was a buggy, unreliable mess. When the OS launched in September 2000, however, the attitude among pundits was more "you don't really need it" than "avoid it like the plague." Still, I pity anyone who ignored the advice just to have Microsoft's latest OS; PCWorld editors dubbed it the fourth-worst tech product of all time, and readers voted it the second-most annoying.
RIM BlackBerry 5810 (2002)
Research In Motion's transition in 2002 from PDAs to smartphones wasn't the most elegant. The company's first phone for the North American market, the BlackBerry 5810, lacked a speaker and microphone for voice calls, so you had to use a wired earpiece. CrackBerry addicts who couldn't wait for a proper phone got bad news three months later, when RIM unveiled a trio of new models that would launch later in 2002--no earpiece required.
Blu-ray Players 1.0 (2007)
The triumph of Blu-ray over HD DVD wasn't all good news for clairvoyant early adopters. Blu-ray players introduced after October 2007 include on-board storage; more important, however, they can play picture-in-picture commentary and connect to the Internet for interactive functions. Earlier models couldn't make the upgrade and gain those features. In other words: Thanks for your service in the format wars, but don't expect any additional benefits.
Original Apple iPhone (2007)
Although the iPad and iPod have conditioned people to expect yearly updates for Apple gadgets, the original iPhone was an infuriating exception when it launched in 2007. Two months after Apple released its game-changing smartphone, the price dropped to $399 for the 8GB model, down from $599. Apple then discontinued the 4GB version, which originally sold for $499. Early adopters were livid. One woman sued. To paraphrase Apple's response to its angry customers: Take this $100 store credit and shut up. The tactic must have worked: The next-generation iPhone 3G sold 1 million units in three days.
Original Microsoft Xbox 360 (2007)
Video game consoles are always expensive when they launch, but the Xbox 360 went above and beyond the call of punishing early adopters with the now-infamous Red Ring of Death. To fix overheating consoles, Microsoft set aside more than $1 billion and extended users' warranties to three years. But that wasn't the only gripe with early Xbox 360s; the first models also lacked HDMI output.
Windows Vista Upgrade Program (2007)
Thanks to a free upgrade program, Windows Vista's earliest adopters should have been the people who bought new computers shortly before the OS was ready. Instead, horror stories abounded, with PC makers taking weeks or months to push upgrades out to their customers. But look at it this way: By the time the upgrades arrived, some of Vista's bugs had been fixed.
The HD DVD Format (2008)
HD DVD is the quintessential cautionary tale against picking sides in a format war. The fight with Blu-ray barely lasted two years, beginning with the debut of Toshiba's first HD DVD player in March 2006 and ending with the company's abandoning the format in February 2008. HD DVD adopters were left with worthless hardware and an obsolete movie library. Warner Bros. offers an HD DVD to Blu-ray exchange program, but participation costs $5 per disc plus shipping and handling--just another expense for cutting-edge consumers.
T-Mobile G1 (2008)
Are you a Google diehard who brags about owning the first Android phone, the T-Mobile G1? Congratulations, you're also stuck with a smartphone that can't officially upgrade past Android 1.6. More upgrades are technically possible--hackers have even equipped their G1 handsets with Android 3.0 (Honeycomb)--but T-Mobile and HTC, the maker of many Android phones, probably are hoping you'll just buy a new phone.
Lala Music Service (2009)
Although Lala began as a CD-trading service, eventually the company settled on a business model that let you buy unlimited streaming of individual songs for 10 cents each. Lala even had an iPhone app in the works that could cache songs for offline listening. But in December 2009, Apple acquired Lala; the company shut down the service five months later. As a consolation, Lala gave paying customers their money back in the form of iTunes credit at 99 cents a song, allowing them to retain just about a tenth of their streaming Lala libraries from the very service they were likely trying to avoid in the first place.
Amazon Kindle (2010)
The Kindle's transformation from luxury gadget to impulse buy isn't based on a single moment but rather on a series of price drops that broke the hearts of early adopters. If you bought a Kindle 2 in February 2009, it cost $359. Five months later, $299. Three months after that, $259. By June 2010, the Kindle 2 cost $189--and if you thought that was a good time to pull the trigger, July brought word of the Kindle 3, including a Wi-Fi model for $139. In less than a year and a half, the Kindle had become thinner, lighter, and $220 cheaper.
3DTV and James Cameron's 'Avatar' (2010)
Avatar was supposed to be the film that sold 3D to the masses, but even now--15 months after the film's box office debut--you can't buy a stand-alone retail copy of Avatar on 3D Blu-ray. That's because Panasonic locked up exclusive rights to bundle the movie with its 3D TVs and starter kits until February 2012. Trying to build a 3D Blu-ray library with a Samsung or Sony TV? Hope you like Piranha.
Intel Sandy Bridge Processors (2011)
Product recalls are bound to happen sometimes, but unlike your typical faulty power plug or bad battery, the design defect in Intel's Sandy Bridge chipset means early adopters will have to send their whole computers back for repair. The problem, which can degrade data transfers over time, affects second-generation Intel 6 series chipsets with Core i5 and Core i7 quad-core processors purchased after January 9 and before January 31. So if you were one of the first in line for Intel's new processor with integrated graphics, I hope you still have an old computer for backup.
Motorola Xoom (2011)
Even if you despise Apple and loathe the iPad, give credit where it's due: When the company advertises a feature, it works on day one. Motorola's Xoom, on the other hand, shipped without Flash, MicroSD support, and 4G connectivity. And to get the free 4G upgrade, users will have to ship their tablets back to Motorola and leave the devices in the company's hands for at least six business days. Any bets on whether the Xoom's price will drop first?
Send and use your invites now - its days are numbered
Shares The launch of two new Spotify services - Unlimited and Open - means two things. One, you don't need to spend a tenner to get rid of the ads; and two, Spotify Free's days are numbered.It's clear that Spotify Open, which delivers 20 hours of ad-funded music per month, is going to replace Spotify Free.If you're already a Free user then things won't change in the foreseeable future, and you can still offer invitations to others; however, we wouldn't be surprised if the ability to invite people to Spotify Free disappears fairly quickly.Check out Spotify's exact words: people are "still able to sign up… by obtaining one of the many millions of invites currently available." The key word there is "currently".A refreshing admissionOn the face of it the inevitable demise of Spotify Free is a bad thing, but we think it's actually quite refreshing. Spotify is basically saying "look, we can't afford to run unlimited streaming for everyone for free, so here's your choice: 25 albums a month for nowt, or as much as you want, ad-free, for a fiver."The problem for Spotify is that the music business doesn't care whether you're a paying customer or not; it just wants to be paid for the music Spotify streams. In an ideal world the odd advert would cover the cost of the relevant licenses, but this isn't an ideal world. There simply isn't enough advertising cash to go around.That means Spotify had a stark choice. If it carried on as normal, it would have to find ways to make more money from free customers. That means more ads, more invasive ads, and more attempts to mine personal data from profit.It's what you might call the Facebook model, and while it works for Facebook - so far, anyway - there's no guarantee it'd work for Spotify. This way is better. The free service isn't too limited, and the ad-free version is now half the price it was previously (Premium remains, but you only need to pay the extra fiver for that if you want offline access or the mobile service).Where Spotify got it wrong was in having a free service that was just too good, and a paid-for service that was just a little bit too pricey. It's addressed both of those issues, and the result should be an increase in the subscription numbers.It's a smart move - but it might also be too little, too late. If the rumour mill is correct, a streaming, subscription-based version of iTunes may be launching in just a few weeks time.
ContributorFormer lion tamer, Girls Aloud backing dancer and habitual liar Gary Marshall (Twitter, Google+) has been writing about tech since 1998, contributing sage advice and odd opinions to .net, MacFormat, Tap! and Official Windows Magazine as well as co-writing stacks of how-to tech books. "My job is to cut through the crap," he says. "And there's a lot of crap." Related news | 科技 |
Rubio to take on climate change in GOP's State of Union response
By ERIKA BOLSTAD, McClatchy Newspapers
WASHINGTON - Sen. Marco Rubio will offer up the Republican response to President Barack Obama's State of the Union address this week, demonstrating the younger, more diverse face of the party as the nation confronts such issues as immigration.
But Rubio doesn't think much of climate change, one of the other hot political topics of the moment. That puts the 41-year-old Rubio squarely in the anti-science wing of his party and among a shrinking number of Americans with doubts about global warming.
The Florida senator earlier this month in an interview questioned whether "man-made activity" is contributing most to global warming, and he suggested there's reasonable debate on whether there's "significant scientific consensus" on the human role. He also questioned whether there's anything the government can do to make a difference.
"When you look at the cost-benefit analysis that's being proposed, if you did all these things they're talking about, what impact would it really have on these changes that we're outlining?" Rubio said during the interview with BuzzFeed. "On the other hand, I can tell you the impact it would have on certain industries and on our economy, and that's where it falls apart."
The Sierra Club in Florida issued a statement that oozed with incredulity. In recent years, extreme weather has "seriously damaged Florida's infrastructure," said Frank Jackalone, the staff director of Sierra Club Florida.
Already, local governments are developing regional plans to deal with rising seas, which are projected to make problems much worse along the Florida coastline. Studies show that Florida faces some dire consequences even with modest sea level rises. They include saltwater intrusion into freshwater supplies, damage to infrastructure such as roads and sewer lines, and flooding that could force people to abandon beachfront property. Some studies show sea levels rising as much as six and a half feet over the next century, said Harold Wanless, chairman of the geology department at the University of Miami.
"We cannot afford to sit idly while the threat of climate change becomes a dangerous reality," Jackalone said. "By denying the climate crisis and rejecting climate action, Marco Rubio's got his head buried in the sand - and that's a bad place to be when the seas are rising."
A spokesman for Rubio, Alex Conant, said, "Sen. Rubio doesn't think that big government can control the weather. But big government can hurt Florida's economy and destroy jobs."
There's no doubt that fiscal matters, immigration and gun control are expected to take precedence in the coming months. Yet Obama almost certainly will address climate change in his State of the Union speech. Obama said last month in his inaugural address that Americans have a moral obligation to address the consequences of global warming, and he's expected to offer more details Tuesday night.
Already, federal agencies are beginning to issue climate adaptation plans that outline what can be done to limit exposure of federal programs, assets and investments to the impacts of climate change, including sea level rise or more frequent or severe extreme weather. The Environmental Protection Agency warned in its preliminary adaptation plan that until now, the agency "has been able to assume that climate is relatively stable and future climate will mirror past climate." "However, with climate changing more rapidly than society has experienced in the past, the past is no longer a good predictor of the future."
STS-29: Pictures of the STS-29 crew and patch
You might also be interested in:
- Hubble Space Telescope: The Hubble Space Telescope (HST) was one of the most important exploration tools of the past two decades, and will continue to serve as a great resource well into the new millennium. The HST found numerous...
- Apollo 11: Driven by a recent surge in space research, the Apollo program hoped to add to the accomplishments of the Lunar Orbiter and Surveyor missions of the late 1960's. Apollo 11 was the name of the first mission...
- Apollo 12: Apollo 12 was launched on Nov. 14, 1969, surviving a lightning strike which temporarily shut down many systems, and arrived at the Moon three days later. Astronauts Charles Conrad and Alan Bean descended...
- Apollo 15: Apollo 15 marked the start of a new series of missions from the Apollo space program, each capable of exploring more lunar terrain than ever before. Launched on July 26, 1971, Apollo 15 reached the Moon...
- Deep Impact Mission: NASA chose Deep Impact to be part of a special series called the Discovery Program on July 7, 1999. The Discovery program specializes in low-cost, scientific projects. In May 2001, Deep Impact was given...
- Galileo: The Galileo spacecraft was launched on October 19, 1989. Galileo had two parts: an orbiter and a descent probe that parachuted into Jupiter's atmosphere. Galileo's main mission was to explore Jupiter and...
- Lunar Orbiter: During 1966 through 1967, five Lunar Orbiter spacecraft were launched, with the purpose of mapping the Moon's surface in preparation for the Apollo and Surveyor landings. All five missions were successful.
Review of "Braintrust. What Neuroscience Tells Us about Morality", by Patricia S. Churchland

The question of "where morals come from" has exercised philosophers, theologians and many others for millennia. It has lately, like many other questions previously addressed only through armchair rumination, become addressable empirically, through the combined approaches of modern neuroscience, genetics, psychology, anthropology and many other disciplines. From these approaches a naturalistic framework is emerging to explain the biological origins of moral behaviour. From this perspective, morality is neither objective nor transcendent – it is the pragmatic and culture-dependent expression of a set of neural systems that have evolved to allow our navigation of complex human social systems.

"Braintrust", by Patricia S. Churchland, surveys the findings from a range of disciplines to illustrate this framework. The main thesis of the book is grounded in the approach of evolutionary psychology but goes far beyond the just-so stories of which that field is often accused by offering not just a plausible biological mechanism to explain the foundations of moral behaviour, but one with strong empirical support.

The thrust of her thesis is as follows: Moral behaviour arose in humans as an extension of the biological systems involved in recognition and care of mates and offspring. These systems are evolutionarily ancient, encoded in our genome and hard-wired into our brains. In humans, the circuits and processes that encode the urge to care for close relatives can be co-opted and extended to induce an urge to care for others in an extended social group. These systems are coupled with the ability of humans to predict future consequences of our actions and make choices to maximise not just short-term but also long-term gain. Moral decision-making is thus informed by the biology of social attachments but is governed by the principles of decision-making more generally. These entail not so much looking for the right choice but for the optimal choice, based on satisfying a wide range of relevant constraints, and assigning different priorities to them.

This does not imply that morals are innate. It implies that the capacity for moral reasoning and the predisposition to moral behaviour are innate. Just as language has to be learned, so do the codes of moral behaviour, and, also like language, moral codes are culture-specific, but constrained by some general underlying principles. We may, as a species, come pre-wired with certain biological imperatives and systems for incorporating them into decisions in social situations, but we are also pre-wired to learn and incorporate the particular contingencies that pertain to each of us in our individual environments, including social and cultural norms.

This framework raises an important question, however – if morals are not objective or transcendent, then why does it feel like they are? This is, after all, the basis for all this debate – we seem to implicitly feel things as being right or wrong, rather than just intellectually being aware that they conform to or violate social norms. The answer is that the systems of moral reasoning and conscience tap into, or more accurately emerge from, ancient neural systems grounded in emotion, in particular in attaching emotional value or valence to different stimuli, including the imagined consequences of possible actions. This is, in a way, the same as asking why does pain feel bad?
Couldn’t it work simply by alerting the brain that something harmful is happening to the body, which should therefore be avoided? A rational person could then take an action to avoid the painful stimulus or situation. Well, first, that does not sound like a very robust system – what if the person ignored that information? It would be far more adaptive to encourage or enforce the avoidance of the painful stimulus by encoding it as a strong urge, forcing immediate and automatic attention to a stimulus that should not be ignored and that should be given high priority when considering the next action. Even better would be to use the emotional response to also tag the memory of that situation as something that should be avoided in the future. Natural selection would favour genetic variants that increased this type of response and select against those that decoupled painful stimuli from the emotional valence we normally associate with them (they feel bad!). In any case, this question is approached from the wrong end, as if humans were designed out of thin air and the system could ever have been purely rational. We evolved from other animals without reason (or with varying degrees of problem-solving faculties). For these animals to survive, neural systems are adapted to encode urges and beliefs in such a way as to optimally control behaviour. Attaching varying levels of emotional valence to different types of stimuli offers a means to prioritise certain factors in making complex decisions (i.e., those factors most likely to affect the survival of the organism or the dissemination of its genes). For humans, these important factors include our current and future place in the social network and the success of our social group. In the circumstances under which modern humans evolved, and still to a large extent today, our very survival and certainly our prosperity depend crucially on how we interact and on the social structures that have evolved from these interactions. We can’t rely on tooth and claw for survival – we rely on each other. Thus, the reason moral choices are tagged with strong emotional valence is because they evolved from systems designed for optimal control of behaviour. Or, despite this being a somewhat circular argument, the reason they feel right or wrong is because it is adaptive to have them feel right or wrong. Churchland fleshes out this framework with a detailed look at the biological systems involved in social attachments, decision-making, executive control, mind-reading (discerning the beliefs and intentions of others), empathy, trust and other faculties. There are certain notable omissions here: the rich literature on psychopaths, who may be thought of as innately deficient in moral reasoning, receives surprisingly little attention, especially given the high heritability of this trait. As an illustration that the faculty of moral reasoning relies on in-built brain circuitry, this would seem to merit more discussion. The chapter on Genes, Brains and Behavior rightly emphasises the complexity of the genetic networks involved in establishing brain systems, especially those responsible for such a high-level faculty as moral reasoning. The conclusion that this system cannot be perturbed by single mutations is erroneous, however. Asking what does it take, genetically speaking, to build the system is a different question from what does it take to break it. Some consideration of how moral reasoning emerges over time in children would also have been interesting. 
Nevertheless, the book does an excellent job of synthesising diverse findings into a readily understandable and thoroughly convincing naturalistic framework under which moral behaviour can be approached from an empirical standpoint. While the details of many of these areas remain sketchy, and our ignorance still vastly outweighs our knowledge, the overall framework seems quite robust. Indeed, it articulates what is likely a fairly standard view among neuroscientists who work in or who have considered the evidence from this field. However, one can presume that jobbing neuroscientists are not the main intended target audience and that both the details of the work in this field and its broad conclusions are neither widely known nor held. The idea that right and wrong - or good and evil - exist in some abstract sense, independent from humans who only somehow come to perceive them, is a powerful and stubborn illusion. Indeed, for many inclined to spiritual or religious beliefs, it is one area where science has not until recently encroached on theological ground. While the Creator has been made redundant by the evidence for evolution by natural selection and the immaterial soul similarly superfluous by the evidence that human consciousness emerges from the activity of the physical brain, morality has remained apparently impervious to the scientific approach. Churchland focuses her last chapter on the idea that morals are absolute and delivered by Divinity, demonstrating firstly the contradictions in such an idea and, with the evidence for a biological basis of morality provided in the rest of the book, arguing convincingly that there is no need of that hypothesis.
Comments:

The Hedonese (July 14, 2012): Good article! Main point: the idea of good and evil is ultimately a 'powerful and stubborn illusion'. But it is impossible to live as though heinous acts such as 'torturing babies' are merely illusory, pragmatic or socially constructed. A crucial element of morality is its 'oughtness', which is not so much explained as denied. Greg Koukl put it nicely: "Evolution may be an explanation for the existence of conduct we choose to call moral, but it gives no explanation why I should obey any moral rules in the future. If one countered that we have a moral obligation to evolve, then the game would be up, because if we have moral obligations prior to evolution, then evolution itself can't be their source." More on Monkey Morality: http://tinyurl.com/cbsqmf

Kevin Mitchell (July 16, 2012): I think what needs explanation is why moral choices have that "oughtness" about them that many people would say is innate. Why do they feel that way? The argument I put forward above is that it is adaptive to have them feel that way as an unconscious motivation to behave in an evolutionarily adaptive way.

James Mckenzie (June 16, 2016): Hitler believed that the Jews were an inferior race and less evolved than Germans. To breed with them would taint the German blood line. Hitler sought to eradicate the Jews by slaughtering them by the thousands. His thinking was based on evolutionary thinking of helping the more fit to survive. According to his own relative morals, he was right. But was Hitler right? Your article fails to answer more concrete issues because it beats around the bush and confuses recognition of morals and pragmatic following of morals versus an actual "oughtness" about morals. With what you have written, what basis or standard do you have to assert that Hitler was in fact wrong? Or are you going to assert that he was subjectively right according to his culture and laws?

Kevin Mitchell (June 17, 2016): I don't see how you get that from what I wrote. Part of the point was that we don't all individually get to choose our "relative morals", as you say. They are an evolved set of predispositions that generally favour social cooperation, fairness, reciprocity, harm avoidance, etc. Just because they don't exist in the abstract as "right and wrong" does not mean they don't exist at all, and certainly does not mean those terms can't be applied to human behaviours.

Shawn Lorenzana (September 22, 2012): Morality is relative to the individual or group. It mainly has to do with subjective experience. Other factors involve what we are and the process of how we came to be.

Unknown (April 24, 2013): "While the Creator has been made redundant by the evidence for evolution by natural selection"... umm, you're wrong. Your assertion assumes that evolution attempts to answer the origin of life. It does not. The question of creation and of a creator are not asked nor answered by natural selection. Evolution answers the question "why is there diversity in biological life?" It simply does not attempt to answer the question (indeed it does not have one) of what the origin of life is. You have turned evolution into a god... that is not the intent of the theory/science of evolution.

Daniel Oset Romera (June 22, 2015): "While the Creator has been made redundant by the evidence for evolution by natural selection and the immaterial soul similarly superfluous by the evidence that human consciousness emerges from the activity of the physical brain." Completely false. Please don't spread lies.
Russia Takes First Step To Permanent Moonbase
Russia And EU Reveal Plans For A 3D Printed Moonbase
October 31, 2015 | Anatol Locker | 3D Printing in Space
What started out as an idea now has a mission plan and a timetable. Despite difficult political times, Russia and the EU are planning to build a permanent lunar base together. The goal is to erect a 3D printed habitat so astronauts can find water on the dark side of the moon. This Tuesday, Russia's space agency announced plans for its first manned lunar mission, which will also prepare the nation's first permanent 3D printed moonbase. The manned mission to the moon is scheduled for 2029, as the head of Russia's space agency, Vladimir Solntsev, confirmed in a press briefing in Moscow.
Earlier this month, the European Space Agency (ESA) announced its collaboration with the Russian space agency on the planned permanent moon habitats.
The Luna 25 mission, scheduled for 2021, will send an unmanned spacecraft to the Moon's south pole. The goal is to investigate suitable target areas, find materials for generating oxygen, and search for the best place to "start the colony".
The possible partnership between ESA and Russia's space agency is for Luna 27, in 2029. ESA's participation in the colony mission is due to receive final approval in 2016.
How To Build a 3D Printed Moonbase
One of the mission highlights is the use of 3D printers to build permanent structures made from lunar soil. ESA is now examining the possibility of using the Moon's soil as the main building material (as All3DP reported).
The process will involve large-scale 3D printing with D-shape printers, which were developed by Enrico Dini.
These 3D printers work on a principle similar to FDM 3D printers on Earth. D-Shape printing works by putting down a layer of fine granulate and adding an inorganic binder. The process is then repeated, layer by layer, until the structure is finished.
By using lunar soil, ESA could save a huge amount of money by avoiding unnecessary space freight to the Moon. Only the machine parts and the binding materials have to be shipped to the lunar surface.
Current D-Shape printing technology uses two inorganic binders, magnesium chloride and a metallic oxide. Neither material is available on the Moon, so one of the challenges is to use the minimum amount of binder per volume of lunar soil without sacrificing rigidity.
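To get a feel for why the binder fraction matters so much for launch mass, here is a back-of-the-envelope sketch. Every number in it is an assumption chosen for illustration (the regolith bulk density, the habitat volume, the candidate binder fractions); none of them come from ESA or the D-Shape team.

```python
# Rough estimate of how much binder must be shipped from Earth for a printed
# regolith structure. All figures are illustrative assumptions, not mission data.

REGOLITH_BULK_DENSITY = 1500.0  # kg per cubic metre of printed material (assumed)

def binder_mass_to_ship(structure_volume_m3: float, binder_mass_fraction: float) -> float:
    """Mass of binder (kg) needed for a structure of the given volume, assuming
    the regolith itself is sourced on site and only the binder is shipped."""
    total_mass = structure_volume_m3 * REGOLITH_BULK_DENSITY
    return total_mass * binder_mass_fraction

if __name__ == "__main__":
    shelter_volume = 50.0  # m^3 of printed wall material for a small habitat (assumed)
    for fraction in (0.10, 0.05, 0.02):
        mass = binder_mass_to_ship(shelter_volume, fraction)
        print(f"binder fraction {fraction:.0%}: ship roughly {mass:,.0f} kg of binder")
```

The point the sketch makes is simple: because the regolith stays on the Moon, the shipped mass scales directly with the binder fraction, so halving the binder per cubic metre halves the freight.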
I’ll See You On The Dark Side Of The Moon
The target landing site is the Aitken crater, the Moon's biggest crater, located near the Moon's south pole. The size of the South Pole-Aitken Basin led scientists to believe that the impact should have exposed the lunar mantle and caused the basin to be full of mantle material.
Some scientists call this place the 'far side' because the environment is very different. Unlike other regions of the Moon, the Aitken region has areas that are always dark. Scientists believe it offers 'icy prisons' with water and other chemicals that can help mission crew members build "buildings" and generate supplies.
In addition to Europe, Russia is also planning to open a dialogue with China. The two nations are reportedly looking to create a permanent space station that will assist future space missions to the Moon and beyond.
License: The text of "Russia And EU Reveal Plans For A 3D Printed Moonbase" by All3DP is licensed under a Creative Commons Attribution 4.0 International License.
Raytheon receives first order for GPS anti-jam land product
By Defense Systems Staff | Dec 04, 2012
Raytheon UK, a subsidiary of the Raytheon Company, was awarded what it describes as a "significant" contract by the UK Ministry of Defense for delivery of a new GPS anti-jam antenna system. The contract is for an undisclosed number of systems for deployment in operational theaters spanning multiple vehicle platforms. This Urgent Operational Requirement contract is the first award for Raytheon's GPS Anti-Jam Land product family. Raytheon UK has delivered more than 7,000 GPS anti-jam systems for air and naval capabilities in the UK and U.S.
The contract will see the deployment of the systems under a "very short timescale," with final delivery of the capability expected to be completed six months from contract award.
SAY Media acquires ReadWriteWeb
By Patricio Robles
Yesterday ReadWriteWeb, a popular technology blog founded by Richard MacManus in 2003, announced that it is being acquired by digital publishing upstart SAY Media. Terms of the deal were not disclosed, but according to TechCrunch's sources, the deal was under $5m.
SAY Media has been active on the acquisition scene, having snapped up web properties including Dogster, Remodelista, a digital agency called Sideshow and publishing platform company Six Apart. The apparent strategy: instead of simply building an ad network for new media, SAY Media wants to consolidate the market and own the properties it sells against.
Will the next media empire be built on a blog? A few years ago, many were arguing that the answer was almost certainly a resounding yes.
With traditional publishers struggling, it seemed logical that new media empires would form online, where it's possible to publish more quickly and more cost-effectively.
But the number of high-profile, independent blogs remaining, particularly in the tech space, continues to shrink.
On paper, Say Media's move sounds like a decent idea. The roll-up approach isn't new, and although it probably fails just as much (if not more) than organic approaches, the timing might be right.
Although ReadWriteWeb has faced traffic challenges recently, and has lost some of its most important writers, it was still a well-known tech blog. If it is true that SAY Media acquired it for less than $5m, it could arguably represent a more sensible deal than AOL's purchase of TechCrunch for an amount rumoured to be in the $30m range. After all, post-acquisition, TechCrunch has lost its founder and editor Michael Arrington, as well as many of its most popular writers.
TechCrunch-like outcomes seem to be the rule, not the exception. The Guardian's acquisition of paidContent, for instance, has not worked as expected, and The Guardian is reportedly trying to unload the blog for the same price it paid for it or less (with no interested parties chomping at the bit to close a deal apparently). Even AOL's purchase of The Huffington Post, the largest of its kind, doesn't look all that sensible today given AOL's continued decline.
So does all of this mean that new media was overhyped? Perhaps. A more accurate conclusion, however, may be that new media isn't really all that different from old media. In other words, building a media empire is tough and as even the most successful pure-play digital publishers are learning, the rewards aren't always as great as many once expected them to be.
Published 15 December, 2011 by Patricio Robles.
Patricio Robles is a tech reporter at Econsultancy. Follow him on Twitter.
I let the animals lead and invite them to express their own sensibilities in their own voices. A conversation with an animal begins by watching gestures and reading facial cues. It is a nonverbal conversation. You do not think an elephant. You try to feel it.
Since 1992, Gregory Colbert has launched expeditions on every continent to collaborate with more than a hundred animal species and the people who share their native environments. His project has taken him to such places as Antarctica, India, Egypt, Burma, Tonga, Australia, Malaysia, Sri Lanka, Namibia, Kenya, Tanzania, Thailand, China, the Arctic, the Azores, and Borneo. Elephants, whales, manatees, sacred ibis, antigone cranes, royal eagles, gyr falcons, rhinoceros hornbills, cheetahs, leopards, African wild dogs, caracals, baboons, eland, meerkats, gibbons, orangutans, penguins, pandas, polar bears, lions, giant Pacific manta rays, and saltwater crocodiles are among the animals he has filmed and photographed. Human collaborators include San bushmen, Tsaatan, Lissu, Massai, Chong, Kazakh eagle hunters, and people from other indigenous tribes around the world. Colbert explains, “I try to create a climate of trust that opens the way for spontaneous interactions with animals. You cannot chart the course of a whale, dictate the wanderings of a cheetah, direct the gestures of an orangutan, or choreograph the flight of an eagle. I spend a great deal of time studying the natural behavior of animals while being mindful of their individual personalities. I believe the Australian Aboriginals were exploring the same enchantments when they painted animals; they were not interested in merely painting the contours of their bodies. They also focused equally on the animal’s interior dream life. The cave paintings of the San from the Kalahari Desert in Africa and the art of other indigenous tribes around the world also demonstrate their ability to look at animals from the inside out. That is what inspired me to begin Ashes and Snow in 1992. Our perception of nature had been human-centric. I hope to see the world through the eyes of a whale, an elephant, a manatee, a meerkat, a cheetah, an eagle, and I have tried to leave the windows and doors open so that others can share the same amazement I felt during each work’s creation.
“An elephant with his trunk raised is a ladder to the stars. A breaching whale is a ladder to the bottom of the sea. My films are a ladder to my dreams.” | 科技 |
Rocks And Landform Project
Alex Breeden, 7th Grade, Cool Rays
Igneous
Igneous rocks are formed by the solidification of molten lava.

Basalt: Basalt is a fine-grained, dark-colored rock. People use basalt for many purposes: crushed basalt is used for road base, concrete aggregate, railroad ballast, and lots of other things. Freshly formed basalt is dark black, but weathering turns the rock a reddish, greenish, or bluish color.

Granite: Granite is a coarse-grained, light-colored rock with grains visible to the naked eye. Granite is mainly composed of quartz. The mineral composition of the rock gives granite a pink, red, grey, or white color with dark mineral grains throughout. This is the best-known igneous rock; it's used to make countertops, floor tiles, paving stone, and more.

Obsidian: Obsidian is an igneous rock that forms when molten rock cools so rapidly that atoms can't arrange themselves into a crystalline structure. The most common color of obsidian is black. It can also be brown, green, or tan, but you rarely see obsidian in blue, pink, or red. Sometimes two colors of obsidian are swirled together; brown and black are the two most common colors swirled together. Obsidian is a popular gemstone. Its hardness is about 5.5, which makes it easy to carve.

Pumice: Pumice is a light-colored, extremely porous igneous rock. It forms during volcanic explosions. The holes in the rock are formed by gas bubbles that were once trapped in it. The most common use of pumice is in lightweight concrete blocks and other lightweight concrete products.

Pegmatite: Pegmatites are extreme igneous rocks that form during the last stage of magma's crystallization. They are extreme because they contain very large crystals and minerals that are rarely found in other rocks. They have limited use as an architectural stone and are rarely used for anything else. The world's best gemstones are found in pegmatite.
Metamorphic rocks are rocks that were once one form of rock but changed to another under the influence of heat, pressure, or other agents.

Slate: Slate is a fine-grained, foliated metamorphic rock formed by the alteration of shale. It is popular for flooring, roofing, and flagging because of its durability and attractiveness. Most slates are gray, ranging from light to dark grey, but slate also occurs in shades of green, red, black, purple and brown. The color of slate is mostly determined by the amount and type of iron and organic material in the rock.

Marble: Marble is a metamorphic rock that forms when limestone is put through the heat and pressure of metamorphism. Marble is crushed and used as an aggregate in railroad beds, highways, and buildings. Marble is usually a light color and has a hardness of 3 on the Mohs hardness scale.

Soapstone: Soapstone is a metamorphic rock that is primarily composed of talc. It is soft and easy to carve, it's heat resistant, and it has a high specific heat capacity. Soapstone is used in a variety of kitchen items, for example countertops, bowls, plates, sinks, electrical panels, wall tiles, and floor tiles.

Quartzite: Quartzite is a coarse-grained metamorphic rock. It can be white, yellow, or brown. When you crack it open, the break reveals a smooth surface. It can be used for buildings, walls, floor tiles, or roofing, and it is often found in mountainous locations.

Gneiss: Gneiss is a coarse-grained, foliated metamorphic rock formed by regional metamorphism. The grains of gneiss are elongated by pressure, and the colors usually form light and dark bands. The lighter bands usually contain quartz.
Sedimentary Rock
Sedimentary rocks are types of rock that are formed by the deposition of material at the Earth's surface and within bodies of water.

Shale: Shale is a fine-grained sedimentary rock. Black shale contains organic material, which can produce gas. Other shales can be crushed and used for clay. The rock is laminated, meaning that it is made up of many thin layers.

Sandstone: Sandstone is a sedimentary rock made of sand-sized grains. It is one of the most common sedimentary rocks. To a geologist, "sand" refers to the particle size of the grains in the rock rather than the material of which they are composed.

Coal: Coal is an organic sedimentary rock that forms from the preservation of plant material. Coal is most commonly used for heat, fuel, and oil.

Chert: Chert is a fine-grained, silica-rich microcrystalline, cryptocrystalline or microfibrous sedimentary rock that may contain small fossils.

Limestone: Limestone is a sedimentary rock composed of calcium carbonate. It mostly forms in clear waters, and it can also form from evaporation. Limestone is not found everywhere, and people pay a lot more for limestone if it isn't quarried nearby.
South and Southeast Asia Video Archive
http://digital.library.wisc.edu/1711.dl/SEAvideo
Video technology has altered the way people view themselves, their nations, and the world in general. It has long been recognized as a powerful cultural force in Western countries, but the rapid expansion of this technology to the rest of the world has largely gone unrecognized. Every country has television broadcasting capability, and the number of television sets steadily increases, even in the most impoverished countries of the "third world." Television viewing is not just a middle and upper class phenomenon in these societies. Indeed, probably the most notable change in village life in South and Southeast Asia over the past decades has been the introduction of video technology. The video cassette recorder/player has added a new dimension to mass media consumption patterns, permitting more frequent and convenient access to video programming which is not directly mediated by government agencies.
Today television, radio, and film are major mechanisms of information transfer and consensus building. These media do not merely provide information about national and world events; they explain and interpret their meaning. They also express the dominant culture and maintain a commonality of values while offering the opportunity to examine both domestic subcultural and foreign value systems. In addition, these media can facilitate and coordinate programs intended to achieve national goals by announcing societal objectives and the means intended to achieve them.
The tapes acquired for this collection are produced by Asians for Asians, thus providing far different cultural perspectives than those produced by Westerners about the region. Despite the quality of such productions as the film The Jewel in the Crown or the television series Vietnam, these productions provide approaches to subject matter which are clearly Western in their orientation. The same topics produced in an Asian cultural context would provide quite different and, probably, more revealing perspectives. The general criteria used in selecting these videotapes will be:
Expressions of the past and present “great traditions” of the various cultures of South and Southeast Asia published by both private and government agencies.
Samples of productions for the mass “pop culture” market, including popular broadcast TV programming as well as taped versions of locally produced motion pictures of broad commercial appeal.
Productions by governmental agencies with the specific purpose of educating viewers about government economic, social and political policies and objectives.
The South and Southeast Asia Video Archive has three principal objectives: first, to produce archival master tapes of the highest possible quality; second, to produce circulating copies which will be loaned to scholars throughout the country; and third, to catalog and publicize the collections broadly. The intent of the project is not just to make current materials available, but to house masters in a preservationally sound manner.
The South and Southeast Asia Video Archive was initiated by Jack Wells, the South Asian Bibliographer at the University of Wisconsin-Madison until 1996. The collection has had generous support from the United States Department of Education, the Luce Foundation, and the General Library System of the University of Wisconsin-Madison.
Image Source: Movie house off main street from the Joel M. Halpern Laotian Slide Collection
Update (October 2015): The Archive may now be considered closed, as since the late 1990s additional videos have been directly added to the Libraries' main catalog, and not included in the Archive.
Birmingham water science leads ecological survival battle
Scientists at the University of Birmingham have developed tools to help restore vital ecosystems found in tropical mangrove forests around the world.
Hydrology experts at the University worked with counterparts in the Netherlands to test methods for measuring water levels in mangrove restoration projects. They then developed recommendations to help boost the likelihood of success for such projects.
The team, which included scientists from Wageningen University and Eijkelkamp Soil & Water conducted fieldwork in three mangrove regions in south-east Asia: Can Gio and Ca Mau, in Vietnam and Mahakam, in Indonesia.
Their research demonstrated that using local water level data in restoration projects is a potentially powerful tool to help reinstate valuable vegetation and trees. The team developed a restoration 'toolkit' of procedures that should help ecologists to get the most out of restoration projects.
University of Birmingham Water Science Lecturer Dr Anne Van Loon said: "Mangrove restoration projects often fail because hydrological conditions are disregarded. We have developed a simple, but robust, toolkit to determine hydrological suitability for mangrove species and guide restoration practice. "Mangrove forests are valuable coastal ecosystems in tropical coastal regions around the world, but increased pressures in these regions, such as logging, aquaculture and coastal development threaten their existence."
"They have a vital role to play in tropical countries for coastal protection, ecosystem functioning and supporting the livelihoods of coastal communities. Restoration projects have been set up but there is little scientific support to guide the restoration practice. "Successful restoration often comes down to a particular site's conditions being suitable for mangrove survival. Our research gives mangrove restorations practitioners the scientific background and practical tools to take these site conditions into account."
Dr Van Loon, from the University's School of Geography, Earth and Environmental Sciences and The Birmingham Institute of Forest Research (BIFoR), added that salinity, soil conditions and hydrology were all important factors in determining a project's chances of success, but hydrology was often overlooked in mangrove restoration, making it an important reason for failure.
Mangrove seedlings, for example, are often planted in the mudflat zone, which is too wet for growth. Mangrove vegetation fails to recover in abandoned shrimp ponds because of impaired water flow. Restoring the hydrology of impounded mangrove areas has proven to lead to successful restoration in Florida, Costa Rica, the Philippines and Thailand, but mangrove organisations need more useful tools to take hydrology into account in their restoration projects.
Carrying out field work in south-east Asia, the team measured water levels and vegetation species composition to place sites into hydrological classes.
This showed that in some locations hydrological conditions had been restored enough for mangrove vegetation to establish. In some locations, water conditions were too wet for any mangrove species to grow, whether natural or planted. The team also measured the effect that removal of obstructions such as dams would have on the hydrology and found that failure of planting could have been prevented. Based on this research the scientists developed a toolkit of measures to improve the effectiveness of mangrove restoration projects. This included recommendations to measure water levels over a minimum period of 30 days to observe the impact of tidal cycles, as well as calculating the movement of water across a restoration site.

Professor Rob MacKenzie, Director of Birmingham Institute of Forest Research (BIFoR), said: "This innovative study is a great example of how the University of Birmingham is working internationally with colleagues to produce high-quality research with a global impact. The team's work will be of immense value in the battle to restore tropical mangrove forests around the world."
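The study itself is not distributed as software in this article, but the kind of calculation such a toolkit standardises is easy to sketch. The snippet below is an illustrative outline only: it assumes hourly water-level readings in metres relative to the soil surface, spanning at least 30 days, and the inundation thresholds and class labels are placeholders rather than values from the Birmingham study.

```python
# Minimal sketch of classifying a restoration site from a water-level record.
# Assumes hourly readings (metres relative to the soil surface) covering at
# least 30 days; thresholds and class names are illustrative placeholders.

def inundation_fraction(levels_m):
    """Fraction of readings during which the site is flooded (water above soil)."""
    flooded = sum(1 for level in levels_m if level > 0.0)
    return flooded / len(levels_m)

def hydrological_class(levels_m, min_days=30, readings_per_day=24):
    if len(levels_m) < min_days * readings_per_day:
        raise ValueError(f"Need at least {min_days} days of readings to cover tidal cycles")
    frac = inundation_fraction(levels_m)
    if frac > 0.8:
        return "too wet for mangrove establishment"
    if frac > 0.3:
        return "suitable only for flood-tolerant mangrove species"
    if frac > 0.05:
        return "suitable for most mangrove species"
    return "rarely inundated; likely too dry"
```

A real assessment would also fold in tidal range, salinity and soil conditions, which the article notes are equally important to a site's chances of success.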
Media contact: Tony Moran, t.moran@bham.ac.uk
Related Journal Article: http://dx.doi.org/10.1371/journal.pone.0150302
Nearly 200 Federal Buildings Grow Less Green
By Kedar Pavgi
The Eisenhower Executive Office Building used 23.6% more energy in 2012 than in 2011, at an additional cost to taxpayers of $338,735.
Some federal buildings are becoming more environmentally friendly and saving thousands of dollars of taxpayer money in the process. Others not so much.
Fifteen federal buildings won commendation in the 2012 Energy Star National Building Competition for exceeding efficiency benchmarks. The competition -- administered by the Environmental Protection Agency -- consisted of 3,000 buildings across the country, including schools, businesses and government properties.
Another 179 of the federal buildings in the competition reported negative energy efficiency ratings in 2012, meaning they used more resources for their daily operations than in the previous year, according to a Government Executive analysis of the Energy Star data. The Eisenhower Executive Office Building, for instance, which houses most White House offices, used 23.6 percent more energy in 2012 than it did in 2011, at an additional cost to taxpayers of $338,735.
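The arithmetic behind that comparison is straightforward. Here is a minimal sketch of the year-over-year check; the usage figures are invented for illustration, chosen only so the percentage changes mirror examples cited in this story, and are not actual Energy Star data.

```python
# Year-over-year energy comparison of the kind described above.
# Usage figures are invented for illustration, not actual Energy Star data.

buildings = [
    # (label, energy use 2011, energy use 2012) in arbitrary units (e.g. kBtu)
    ("Building A (grew less efficient)", 100_000, 123_600),  # +23.6%
    ("Building B (top performer)",        80_000,  50_560),  # -36.8%
]

for label, use_2011, use_2012 in buildings:
    pct_change = (use_2012 - use_2011) / use_2011 * 100.0
    verdict = "used more energy" if pct_change > 0 else "used less energy"
    print(f"{label}: {pct_change:+.1f}% vs. 2011 ({verdict})")
```

A positive change flags a building as having lost ground, which is the criterion the analysis above applies to the 179 buildings.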
An EPA spokeswoman conceded the lost ground. “These buildings serve as good reminders that maintaining good energy performance doesn’t just happen automatically,” she said in an email. “Most of us are aware in our own lives of how the pounds can creep on when you’re not paying attention. The same goes for buildings." She noted an EPA study finding commercial buildings that tracked their usage over three years reduced their consumption by an average of 7 percent.
Dan Cruz, a spokesman for the General Services Administration, attributed some of the additional energy consumption to renovations. “A variety of factors could have contributed to a building's increase in energy use, like construction projects and occupancy changes for example," he said. "As construction projects and renovations were completed in some of these facilities, energy use increased as tenants moved back in and the buildings returned to full occupancy. The renovation work itself can also contribute to greater energy use at the site for the duration of the project." The Eisenhower Executive Office Building has been under major construction for years. Among federally owned buildings lauded for their green efforts, the Martin Luther King Jr. Courthouse in Newark, N.J., and the San Antonio Federal Building reported the greatest reductions in energy use -- 36.8 percent and 34.4 percent savings, respectively. Twenty-four GSA-owned buildings became 20 percent more efficient, according to the Energy Star data.
All buildings eligible for recognition in the competition had their data verified for the baseline and comparative periods (calendar years 2011 and 2012) by a licensed professional engineer or registered architect, EPA said. The 15 federal buildings that received commendations saved taxpayers a combined $961,470. Individually, the buildings saved between $1,305 for the U.S. Border Station building in North Troy, Vt., and $371,720 for a GSA warehouse in Franconia, Va.
“GSA is proud to have so many of our buildings meet top rankings in such a wide competition,” the agency’s Public Buildings Service Commissioner Dorothy Robyn said in a statement. “We are using variety of strategies to make our existing buildings more energy efficient, and they are paying off.”
This story has been updated with additional comment.
Formation of the parent bodies of the carbonaceous chondrites
J.N. Goswami and D. Lal
Abstract: We have carried out extensive particle track studies for several C2 chondrites. On the basis of these and the available data on spallogenic stable and radioactive nuclides in several C1 and C2 chondrites, we have constructed a scenario for the precompaction irradiation of these meteorites. We discuss the rather severe constraints which these data place on the events leading to the formation of the parent bodies of the carbonaceous chondrites. Our analyses suggest that the precompaction solar flare and solar wind irradiation of the individual components most probably occurred primarily while the matter had accreted to form swarms of centimeter- to meter-sized bodies. This irradiation occurred very early, within a few hundred million years of the birth of the solar system; the pressure in the solar system had then dropped below 10^-9 atm. Further, the model assumes that soon after the irradiation of carbonaceous matter as swarms, the small bodies coalesced to form kilometer-sized objects, on time scales of 10^(5±1) years, a constraint defined by the low cosmogenic exposure ages of these meteorites. Collisions among these objects led to the formation of much-larger-sized parent bodies of the carbonaceous chondrites. Implicit in this model is the existence of "irradiated" components at all depths in the parent bodies, which formed out of the irradiated swarm material.
First Solar Q3 Preview: Order Book, GE Partnership And Chinese Projects In Focus
October 29th, 2013 by Trefis Team
First Solar (NASDAQ:FSLR), one of the world’s largest solar companies, is scheduled to report its Q3 2013 earnings on October 31. The company’s results have been somewhat mixed over the last few quarters, given the uneven revenue recognition from the systems business and fluctuating third party module sales. During the second quarter, revenues were down 32% sequentially to around $520 million, while net income dropped to around $33.6 million from about $59 million. [1] For this quarter, we expect revenues and earnings to improve on a sequential basis owing to the possibility of better revenue recognition from the projects business. Financial results aside, there are three key factors that we will be watching when the firm releases its earnings for the quarter.
Trefis has a $43 price estimate for First Solar which is roughly 17% below the current market price.
1) Expanding The Project Order Book: First Solar’s systems business has been the driving force behind its performance over the last several quarters. While the company has been steadily building its portfolio of projects under development through a series of acquisitions from independent project developers, it has not seen a similar trend in terms of bookings. Although the company’s outlook for the systems business is positive, expecting it to drive total revenues to as high as $4.8 billion by 2015, most of this guidance comes from uncommitted projects. First Solar has also been executing its systems contracts at a faster pace than it has been adding new bookings. As of August 2013, the total backlog of bookings fell to 2.2 gigawatts (GW) from around 2.6 GW in the beginning of the year. While this is somewhat concerning, the firm says that it is has been pursuing several new booking opportunities that amount to around 8 GW. The progress in translating these opportunities into new orders will be a critical factor to watch.
2) Progress On The GE Partnership: During the second quarter, the company announced a technology and commercial partnership with GE (NYSE:GE), under which First Solar will acquire GE’s cadmium telluride (Cd-Te) solar intellectual property and will also begin supplying solar panels to GE for its global deployments. We believe this deal could significantly expand First Solar’s panel shipments, since GE is the world’s largest supplier of power equipment and has relationships with major power plant developers and electric utilities across the world. We will be keen to hear the company’s progress in this collaboration during the conference call.
3) Status Of Chinese Projects: China is expected to become the world’s largest solar market in terms of volumes this year, but First Solar’s presence in this market remains rather limited. Although the company signed a memorandum of understanding with the Chinese government in 2009 to build a 2 GW solar plant in the Inner Mongolia region, the construction activity never really took off. However, during the Q1 2013 conference call, management indicated that the first phase of this project, with a capacity of around 30 megawatts, was finally slated to begin construction during the third quarter of this year, subject to regulatory approvals. We believe that if these projects materialize as planned, they could give the company a much-needed boost in the Chinese market.
Trefis will be updating its model and price estimate for First Solar following the earnings release.
Notes: 1. First Solar Press Release
Steelhead could return to local river
Environmentally Speaking
By Ron Bottorff
August 20, 2008 10:00 p.m.
On July 25, the National Marine Fisheries Service issued a document that could eventually return steelhead salmon to the Santa Clara River.

In a document that will have major effects on future operations at the Vern Freeman Diversion facility near Santa Paula by the United Water Conservation District, the National Marine Fisheries Service issued a final biological opinion concluding that future operation of the facility in the proposed manner could jeopardize the existence of the Southern California steelhead.

A biological opinion is a technical document written after in-depth study by wildlife agency scientists that reviews the proposed human impacts to an endangered species. The agency then determines whether that species can continue to survive.

This biological opinion also laid out a set of actions, termed a "reasonable and prudent alternative," that United could take to avoid the likelihood of steelhead extinction. This fish was once plentiful in local rivers but is now listed as endangered.

The Freeman Diversion is owned by the United States Bureau of Reclamation and operated by United Water. Starting in May 2005, the National Marine Fisheries Service has been in formal consultation with the bureau under Section 7 of the Endangered Species Act on how diversion operations (including the existing fish ladder) affect the steelhead and its critical habitat.

The service document was issued as a result of the need for new operational procedures at the diversion. Under the Endangered Species Act, the bureau must consult with the service if facility operation involves impacts to an endangered species that is within the service's jurisdiction, as is the case for anadromous species that spend portions of their life cycles in the ocean.

A fish ladder does currently exist at the facility. But it has not allowed successful passage over the past decades of steelhead migrating upstream from the ocean.

The Santa Clara is deemed one of the most important rivers in Southern California for steelhead recovery. The 122-page biological opinion does not specifically define what changes are needed at the facility, but instead calls for convening of a panel of experts to establish interim physical modifications to the facility (to be operational by Dec. 21) as well as long-term modifications to be complete by Dec. 31, 2011, when the bureau's discretion over operation of the diversion lapses.

Recovery of steelhead runs in the Santa Clara River has long been a top priority for the Friends of the Santa Clara River. The southern steelhead was listed as endangered about 10 years ago.

Since then, there has been a plethora of meetings, discussions, issuance of formal and informal documents and studies. But effective action has not ensued, as evidenced by the fact that only a handful of adult steelhead have been observed in the Santa Clara River in recent years.

Friends of the River believes it is time - in fact, way past time - to take the appropriate action.
Now proposed rule changes could eliminate such progress toward species protection.

Under the current regulations, federal agencies must consult with scientists at the Fish and Wildlife Service or the National Marine Fisheries Service to determine whether a project is likely to harm endangered species or habitat.

The new regulations would:
- Exempt thousands of federal activities from review under the Endangered Species Act
- Eliminate checks and balances of independent oversight
- Limit which effects can be considered harmful
- Prevent consideration of a project's contribution to global warming
- Set an inadequate 60-day deadline for wildlife experts to evaluate a project in the instances when they are invited to participate, or else the project gets an automatic green light
- Enable large-scale projects to go without review by dividing them into hundreds of small projects

Because these regulations are administrative and not legislative, they won't need the approval of Congress.

Friends of the River joins with thousands of other local conservation groups and individuals across the nation in asking President Bush to rescind such inappropriate rule-making and let our independent wildlife scientists do their jobs.

Without proper checks and balances, these new rules may simply mean extinction for many of our beautiful and rare plants and animals throughout the United States.

Ron Bottorff is chairman of Friends of the Santa Clara River. His column reflects his own views, not necessarily those of The Signal.
Photo: Doug Rich | Amateur astronomer Doug Rich of Hampden has discovered 22 supernovas in the last nine years, including this one in 2005.
By Tom Walsh, BDN Staff
Posted Dec. 05, 2012, at 3 p.m.
STEUBEN, Maine — You could say that Doug Rich of Hampden has spent most of his professional life studying the sky.
A retired air traffic controller, Rich is also an accomplished amateur astronomer. Over the last nine years of stargazing, he has discovered 22 previously undetected supernovae — exploding stars — in a wide range of galaxies. His latest find was logged just a few weeks ago.
“A supernova is the explosive death of a star,” Rich said Wednesday. “Nine years ago I read an article in a telescope magazine about an amateur in Australia who was doing the same thing. I do these searches from home, using a 16-inch, robotic reflector telescope, which is linked to a computer. I have the computer screen image remoted to my living room so that I can watch TV while keeping an eye on things.”
The computer’s software, he said, automatically searches from a list of galaxies for new objects that weren’t at the same point in the sky on previous scans. Rich said he reviews the overnight images the next day and compares them with earlier images.
“What I’m seeing as a new object may be a lot of things,” Rich said. “It may be ‘noise’ from the camera, or it may be an asteroid. I eliminate the possibilities by taking a confirmation image.”
Once Rich confirms a new discovery, he notifies the Massachusetts-based Central Bureau for Astronomical Telegrams. New sightings are named in order of their discovery.
“The first new discovery in 2013 would be ‘2013A’ and will continue through ‘2013Z’ and then go on to ‘AA,’” he said. “Astronomers or scientists who do any kind of report on their discoveries are given credit.”
Rich is well aware that dark skies that are not polluted by man-made light are becoming rarer every day. The Down East skies and those above Baxter State Park are among the least light-polluted in Maine, he said.
“Dark skies are going fast,” Rich said. “Hampden used to be really dark, but that’s changed with urban sprawl. I live on a dead-end road, but a developer put up three houses near me. When I’m doing my telescope surveys, I sometimes have to ask my neighbors to turn off any lights they don’t need.”
Rich is teaming up with the Eagle Hill Institute in the Washington County community of Steuben, where this weekend he will be giving an hour-long lecture on supernovae and the techniques and technologies he uses in discovering them. The event, at 5:30 p.m. Saturday, Dec. 8, is free and open to the public. Those attending will have the option of placing an order before the lecture for an individual-size gourmet pizza to be served after the event. The cost of the pizza is $10 and includes a beverage and dessert ice cream.
The 150-acre Eagle Hill Institute, which is located in Washington County off the Dyer Bay Road in Steuben, is in the process of launching a new astronomy initiative that will include construction of a five-story observation tower located on a hilltop that offers an unobstructed 360-degree panorama of the sky. The nonprofit institute has no firm timeline for building its proposed observatory facilities, but is beginning to ramp up fundraising for the project.
For information about the Institute’s astronomy effort and Rich’s upcoming lecture, call 546-2821.
Wednesday, May 29, 2013
Accessory Turns iPhone Into High-Tech Lab
A new biosensing tool puts the power of a high tech laboratory in the pockets of researchers in the field. This iPhone-enabled device could be used in pop-up clinics, waste management sites, refugee camps and anywhere else the mobile testing of biological materials such as blood is necessary.
Developed by researchers at the University of Illinois Urbana-Champaign, the tool consists of an iPhone cradle and an accompanying app. Although the cradle only holds about $200 worth of optical components, it performs as accurately as a $50,000 spectrophotometer. The wedge-shaped cradle keeps the iPhone's camera aligned with a series of lenses and filaments, which are used to measure light. And at the heart of the device's biosensing capabilities is a simple microscope slide coated with a photonic material, which reflects only one wavelength of light and allows the rest of the spectrum to pass through the slide.
When a biological substance, like protein or DNA, attaches itself to the photonic crystal slide, the color that is reflected will shift from a shorter to longer wavelength. The size of this shift depends on the amount of the substance present on the slide. So to test for the presence of a protein, for example, researchers would first prime the slide with the protein they want to test for, insert the slide into the cradle and then use the app to measure the length of the protein's reflected wavelength. This gives them a base measurement. They can then expose the slide to a field sample containing the protein and remeasure the wavelength. The shift in the reflected wavelength tells the researchers how much of the protein is present in the sample. The test takes only a few minutes to complete, and the app walks the user through the entire process. This means that samples that once had to be sent back to a lab for analysis can now be tested on the spot in a matter of minutes.
Researchers have high hopes for this device as a viable option for doctors, nurses, scientists and others in need of efficient, affordable biological testing on the go. They think it could be particularly useful in developing countries, where resources are limited and laboratories are scarce. With help from the iPhone's GPS tool, the device could even be used to track groundwater contamination, map the spread of pathogens or monitor contaminant checks in the food processing and distribution chain, according to researchers.
"We're interested in biodetection that needs to be performed outside of the laboratory," said Brian Cunningham, professor of electrical and computer engineering at the University of Illinois. "Smartphones are making a big impact on our society: the way we get our information, the way we communicate. And they have really powerful computing capability and imaging."
The team is now working on a similar cradle and app solution to provide mobile testing on Android devices. And a recent grant from the National Science Foundation is enabling researchers to expand the range of biological experiments that can be performed with the tool, including tests that will detect toxins in harvested corn and soybeans, as well as tests to detect pathogens in food and water.
The use of smartphones as a platform for detecting environmental phenomena seems to be a growing trend, not only among researchers, but consumers as well.
In the past year, a number of smartphone-enabled tools and accompanying apps have come on the market that measure everything from the "organic-ness" of fruits to the temperature of soil in a vegetable garden. Lapka, an iPhone device that includes a Geiger counter and electromagnetic sensors, is marketed as a tool for better understanding and exploring the environment. This high tech toy is turning heads as it takes science out of the lab and puts it in the hands of the everyday consumer.
Cunningham and his team believe that their device can also be used by individuals, not for analyzing their personal space, but for monitoring their health. "A lot of medical conditions might be monitored very inexpensively and non-invasively using mobile platforms like phones," said Cunningham. "[Smartphones] can detect a lot of things, like pathogens, disease biomarkers or DNA, things that are currently only done in big diagnostic labs with lots of expense and large volumes of blood." One such possible application, according to researchers, is testing children and pregnant women for Vitamin A and iron deficiencies.
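The readout described earlier in the story (record the peak reflected wavelength of the primed slide, expose it to the sample, and convert the red-shift of the peak into an amount of bound material) can be sketched in a few lines. The spectra, the peak-finding method and the calibration slope below are all assumptions for illustration; the published instrument and app use their own calibration.

```python
# Illustrative sketch of the peak-shift readout described in the article.
# The spectra, the peak-finding method and the calibration slope are all
# invented for this example; the real instrument and app use their own.
import numpy as np

def peak_wavelength(wavelengths_nm, intensity):
    """Wavelength at which the reflected intensity is greatest."""
    return float(wavelengths_nm[np.argmax(intensity)])

def bound_material(baseline, exposed, wavelengths_nm, nm_per_unit=0.05):
    """Convert the red-shift of the reflected peak into an amount of bound analyte."""
    shift_nm = peak_wavelength(wavelengths_nm, exposed) - peak_wavelength(wavelengths_nm, baseline)
    return shift_nm / nm_per_unit            # nm_per_unit is an assumed calibration

# Synthetic spectra: a reflectance peak that moves from 550.0 nm to 551.5 nm.
wl = np.linspace(500.0, 600.0, 2001)
before = np.exp(-((wl - 550.0) / 2.0) ** 2)
after = np.exp(-((wl - 551.5) / 2.0) ** 2)
print(f"{bound_material(before, after, wl):.1f} units bound")   # 30.0 units
```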
Climate change posing multiple challenges for Pakistan
ISLAMABAD: Experts on Monday said that South Asia in general and Pakistan in particular were confronting multiple challenges due to climate change and water scarcity issues.
The experts expressed these views at a workshop, 'Climate Change', organised by Individualland Pakistan (IL) in collaboration with the Friedrich Naumann Foundation. They urged the governments in the region to take appropriate measures as early as possible.
The experts pointed out that half of the population in Asia was facing disasters in the shape of earthquakes, tsunamis, floods and other natural calamities because of the changing climate. They said the 2010 and 2011 floods in Pakistan were due to climate change. A low proportion of forest-covered area in Pakistan was the main cause of floods, they said, adding that the country had less than three percent of its area covered under forests.
"Deforestation is on the rise. Riverbeds are usually occupied by people. There is no regulatory body to properly monitor the riverbeds. Flood water can be used for the benefit of people, but due to inefficient management, it turns into disaster," the speakers at the conference said.
The participants said that changes in rain and snowfall patterns over the past few years, rising temperatures and ever-expanding human settlements were disturbing natural climatic patterns.
A number of scientists and policy researchers argue that climate changes are natural and have occurred in the past as well, while others say these changes are being caused by human activities like deforestation and industrialisation.
The Roadmaps towards Sustainable Energy futures (RoSE) project aims to provide a robust picture of energy sector transformation scenarios for reaching ambitious climate targets. A broad and systematic exploration of decarbonization scenarios for the energy system is indispensable for better understanding the prospects of achieving long-term climate protection targets. RoSE is assessing the feasibility and costs of climate mitigation goals across different models, different policy regimes, and different reference assumptions relating to future population growth, economic development and fossil fuel availability, in order to provide vital insights into the overarching policy question: What are robust roadmaps for achieving a sustainable global energy future?
Funded by Stiftung Mercator, coordinated by the Potsdam Institute for Climate Impact Research (project chair: Prof. Ottmar Edenhofer; project director: Dr Elmar Kriegler), and with the participation of 5 leading integrated assessment modeling teams from the EU, U.S. and China, and 4 domain experts on energy security, energy access, fossil resources and the transport sector, the project is running over a period of three years (2010-2012).
RoSE aims to produce a large scenario database and a series of research papers that can serve as a key input to international climate policy assessments, like the IPCC 5th Assessment Report.
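The cross-cutting design described above amounts to running every combination of model, policy regime and reference assumption, which can be sketched as a simple scenario matrix. The dimension labels below are invented placeholders, not RoSE's actual model names or scenario definitions.

```python
# Illustrative scenario matrix: every combination of model, policy regime and
# reference assumption. The labels are invented placeholders, not RoSE's
# actual model names or scenario definitions.
from itertools import product

models = ["model_A", "model_B", "model_C"]
policy_regimes = ["no_policy", "delayed_action", "stringent_target"]
population = ["low", "medium", "high"]
gdp_growth = ["low", "medium", "high"]
fossil_availability = ["constrained", "abundant"]

scenarios = [
    {"model": m, "policy": p, "population": pop, "gdp": g, "fossil": f}
    for m, p, pop, g, f in product(models, policy_regimes, population, gdp_growth, fossil_availability)
]
print(len(scenarios))   # 3 * 3 * 3 * 3 * 2 = 162 harmonized model runs
```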
NASA's Marshall Center Completes Wind Tunnel Testing for Sierra Nevada Corporation's Dream Chaser Space System
Published by Klaus Schmidt on Tue May 15, 2012 5:38 am via: NASA
HUNTSVILLE, Ala. – NASA's Marshall Space Flight Center in Huntsville, Ala., successfully completed wind tunnel testing for Sierra Nevada Corp. (SNC) Space Systems of Louisville, Colo. The test will provide aerodynamic data that will aid in the design of the new Dream Chaser Space System.
During tests at Marshall's wind tunnel facility, a scale model of SNC's Dream Chaser orbital crew vehicle was mounted on a scale model of the United Launch Alliance's Atlas V launch vehicle. Over 400 data runs were performed at subsonic, transonic and supersonic speeds to study how air moves past the model. Nine full-stack configurations were tested over a Mach range of 0.4, or 304 miles per hour at sea level, to Mach 5, or 3,800 miles per hour at sea level, at various launch vehicle roll angles.
The data generated from this test series, coupled with data from computational fluid dynamics studies, will define the aerodynamic characteristics of the Dream Chaser – Atlas V launch stack during the ascent phase of flight. Obtaining this data will enable higher-fidelity loads analysis, better definition of launch vehicle performance, and will aid in further refining Dream Chaser's trajectory design for orbital vehicle launches.
“We’re glad Marshall could support SNC in completing these wind tunnel tests quickly and affordably and early in the design phase,” said Teresa Vanhooser, manager of the Flight Programs and Partnerships Office at Marshall. “Our trisonic wind tunnel and engineering staff helps partners understand the aerodynamic integrity and stability of spacecraft and launch vehicles, like the Dream Chaser, over a variety of wind speeds and phases of flight.”
Mark Sirangelo, corporate vice president and head of SNC’s Space Systems, said: “The Dream Chaser Program is grateful for the opportunity to leverage the experience, expertise, and resources of Marshall, made possible by the unique government-commercial partnership created through NASA’s Commercial Crew Development Program. Sierra Nevada Corporation looks forward to expanding our successful relationship with Marshall, as well as creating new business opportunities in the Huntsville area.”
Marshall’s Aerodynamic Research Facility’s 14-inch trisonic wind tunnel is an intermittent, blow-down tunnel that operates from high-pressure storage to either vacuum or atmospheric exhaust. The facility is capable of conducting tests in the subsonic, transonic and supersonic mach ranges using its two interchangeable test sections. Subsonic Mach numbers are below Mach 1, the speed of sound, or 760 miles per hour at sea level, while transonic speeds approach and are slightly above Mach 1. The facility can achieve a maximum supersonic Mach number of 5, or five times the speed of sound.
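The sea-level figures quoted in the article follow directly from the roughly 760 mph speed of sound it cites, as the short sketch below shows. The 760 mph constant and the regime boundaries are approximations: the true speed of sound varies with temperature, and the transonic band is conventionally taken as roughly Mach 0.8 to 1.2.

```python
# Quick check of the sea-level figures quoted above, using the article's
# ~760 mph speed of sound; the true value varies with air temperature, and
# the regime boundaries below are conventional approximations.
SPEED_OF_SOUND_MPH = 760.0

def mach_to_mph(mach):
    return mach * SPEED_OF_SOUND_MPH

def regime(mach):
    if mach < 0.8:
        return "subsonic"
    if mach <= 1.2:
        return "transonic"
    return "supersonic"

for m in (0.4, 1.0, 5.0):
    print(f"Mach {m}: {mach_to_mph(m):.0f} mph at sea level ({regime(m)})")
# Mach 0.4 -> 304 mph, Mach 5 -> 3800 mph, matching the figures in the article.
```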
SNC is currently one of the NASA Commercial Crew Development (CCDev) partners awarded funding under a Space Act Agreement to mature their Dream Chaser orbital crew transportation system. NASA’s CCDev effort is being led by NASA’s Kennedy Space Center and supported by NASA technical experts across the agency, including the Marshall Center for a variety of technical areas.
The effort to define the aerodynamic characteristics of the Dream Chaser Space System is being conducted under a reimbursable Space Act Agreement funded by SNC and executed with the support of aerodynamicists and wind tunnel experts from the Marshall Center and United Launch Alliance.
Experiments On Life After Death By Marcelo Gleiser
May 18, 2011
Since my co-blogger Adam Frank posted yesterday that hilarious Monty Python video examining whether there is life after death, and Mark Memmott of The Two-Way blog wrote Monday on Hawking's pronouncements on the same topic quoting me, I couldn't resist contributing to this valuable debate with a few remarks on life-after-death experiments. I quote from my book A Tear at the Edge of Creation, where I described both my teenage fantasy of measuring the weight of the soul and a "serious" attempt from early in the twentieth century that got a lot of press at the time:
Reading Frankenstein as a teenager incited even more my fantasy of becoming a Victorian natural philosopher lost in the late twentieth century. When I joined the physics department at the Catholic University at Rio in 1979, I was the perfect incarnation of the Romantic scientist, beard, pipe and all. I remember, to my embarrassment, my experiment to "investigate the existence of the soul." If there was a soul, I reasoned, it had to have some sort of electromagnetic nature so as to be able to animate the brain. Well, what if I convinced a medical facility to let me surround a dying patient with instruments capable of measuring electromagnetic activity, voltmeters, magnetometers, etc.? Would I be able to detect the cessation of life's imbalance, the arrival of death's final equilibrium? Of course, the instruments had to be extremely sensitive so as to capture any minute change right at the moment of death. Also, for good measure, the dying patient should be on a very accurate scale, in case the soul had some weight. I remember explaining my idea to a professor [...] I can't remember exactly what he said, but I do remember his expression of muted incredulity.
Of course, I was only half serious in my excursion into "experimental theology." But my crackpot Victorian half, I am happy to say, had at least one predecessor. In 1907, Dr. Duncan MacDougall of Haverhill, Massachusetts, conducted a series of experiments to weigh the soul. Although his methodology was highly suspicious, his results were quoted in The New York Times: "Soul has weight, physician thinks," read the headline. The weight came out at three quarters of an ounce (21.3 grams), albeit there were variations among the good doctor's handful of dying patients. For his control group, MacDougall weighed fifteen dying dogs and showed that there was no weight loss at the moment of death. The result did not surprise him. After all, only humans had souls.
Those interested in more details of this and other stories should read Mary Roach's hilarious and informative Stiff: The Curious Lives of Human Cadavers and consult this site. Dr. MacDougall's measurements inspired the 2003 Hollywood hit movie 21 Grams, which featured Sean Penn playing the role of an ailing mathematician.
Back to Hawking, I must agree with him. Although from a strictly scientific viewpoint we haven't proven that there is no life after death, everything that we know about how nature works indicates that life is an emergent biochemical phenomenon that has a beginning and an end. From a scientific perspective, life after death doesn't make sense: there is life, a state when an organism is actively interacting with its environment, and there is death, when this interaction becomes passive. (Even viruses can only truly be considered alive when inside a host cell. But that's really not what we are talking about here, which is human life after death.)
We may hope for more, and it's quite understandable that many of us would, but our focus should be on the here and now, not on the beyond. It's what we do while we are alive that matters. Beyond life there are only memories for those who remain.
Copyright 2011 National Public Radio. To see more, visit http://www.npr.org/.