Updates from September 2017

  • jkabtech 8:17 pm on September 30, 2017 Permalink |

    Roy Hill turns to Azure, SAP for IoT insights 

    Miner’s computer systems take shape.

    Gina Rinehart’s Roy Hill mine has spent the past year co-funding and piloting a data science platform created using Microsoft Azure components including the Azure IoT hub.

    The iron ore miner is the first user of a ‘business analytics’ platform and capability that was co-developed and funded by systems integrator Ajilon Australia alongside Roy Hill.

    It is understood the system is being mostly used to analyse sensor data from heavy mobile equipment (HME) and locomotives used to haul ore from pit to port.

    Typically, the data is being aggregated at Roy Hill – such as in a data historian or ‘data lake’ – and then fed through the business analytics system and up into Azure.

    However, the miner is understood to be also testing analytics on data that is streamed from the field in as near to real-time as is feasible. Ajilon’s national solution lead Peter Hawkins told iTnews this wasn’t yet a “production use case but it’s working”.
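    The pattern described above (aggregate readings locally, then push batches to the cloud, with streaming as a near-real-time alternative) can be sketched in a few lines of Python. This is an illustrative stand-in, not Roy Hill's or Ajilon's actual pipeline; the class name and the `upload` callable are hypothetical.

```python
from datetime import datetime, timezone

class TelemetryPipeline:
    """Illustrative sketch: buffer sensor readings locally (the 'data lake'
    pattern) and flush them upstream in batches, with an optional near-real-time
    mode that forwards each reading as it arrives. All names are hypothetical."""

    def __init__(self, upload, batch_size=3, stream=False):
        self.upload = upload          # callable standing in for the cloud ingest API
        self.batch_size = batch_size
        self.stream = stream
        self.buffer = []

    def ingest(self, sensor_id, value):
        reading = {
            "sensor": sensor_id,
            "value": value,
            "ts": datetime.now(timezone.utc).isoformat(),
        }
        if self.stream:
            self.upload([reading])    # near-real-time: forward immediately
        else:
            self.buffer.append(reading)
            if len(self.buffer) >= self.batch_size:
                self.flush()

    def flush(self):
        if self.buffer:
            self.upload(self.buffer)
            self.buffer = []

sent = []
pipe = TelemetryPipeline(upload=sent.append, batch_size=3)
for i, temp in enumerate([71.2, 73.5, 70.9, 74.1]):
    pipe.ingest(f"hme-{i % 2}", temp)
print(len(sent), len(sent[0]))  # one batch of three uploaded; one reading still buffered
```

    Setting `stream=True` would model the not-yet-production streaming case Hawkins mentions: each reading is forwarded on arrival instead of waiting for a batch.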

    Hawkins said that around the same time as Ajilon began building out an analytics practice in 2015, it learned of Roy Hill’s interest in trialling similar technology.

    The two partnered, and worked closely with Microsoft Azure engineers in Redmond to create a “templated, repeatable” analytics platform based on Microsoft components.

    “We had a direct line into Redmond and were able to call on their engineers building this stuff to resolve some of the challenges

    View the Original article

  • jkabtech 12:17 pm on September 30, 2017 Permalink |

    Avaya to sell networking business for $131 million 

    Extreme Networks emerges as stalking horse bidder.

    Avaya is set to sell its networking business to Extreme Networks for US$100 million (A$131.7 million) as it restructures to stave off bankruptcy.

    The vendor – which filed for chapter 11 bankruptcy protection in January to reorganise its business – got into the networking market after buying a slice of Nortel in 2009 under similar circumstances.

    Avaya said other “interested parties” could still bid for its networking business, and if they did Extreme’s offer would “set the floor value” in an auction process.

    Any purchase of the assets is expected to close by June 30 this year.

    “As the stalking horse bidder, Extreme will be entitled to a break-up fee and expense reimbursement, if it ultimately does not prevail as the successful bidder at the required auction for Avaya’s assets,” Extreme Networks said in a statement.

    Extreme CEO Ed Meyercord said he expected Avaya’s networking business “to generate over US$200 million in annual revenue”.

    He said it would increase the company’s market share and “offer new opportunities for customers”.

    Avaya’s A/NZ managing director Peter Chidiac sought to reassure customers and partners in a separate statement.

    “While we understand this announcement may cause uncertainty in the market, we want to assure our Australian and New Zealand customers and partners there will be no change to the way we interact with and support them during the sale and ultimate transition process,” he said.

    View the Original article

  • jkabtech 4:17 am on September 30, 2017 Permalink |

    Melbourne Uni turns to wi-fi to limit students to their own faculties 

    Micro DC panel discussion at Cloud & DC Edge 2017.

    Reveals ‘edge’ processing plans and projects.

    The University of Melbourne is considering a plan to dissuade students from congregating in libraries outside their areas of study by limiting their access to campus wi-fi in those places.

    It is understood that undisclosed faculties have raised concerns that their students are unable to access faculty library resources due to other students using the spaces.

    The university is hoping to tap its campus-wide wi-fi network – which has about 5000 access points – to identify the study areas of students entering or seated in faculty libraries “on-the-fly”.

    Students entering a faculty library not related to their area of study could find their wi-fi access limited as capacity is prioritised to students actually enrolled in that faculty’s studies.

    “Deans of some faculties are not happy that their libraries are being filled with all sorts of students who aren’t necessarily in their faculty, so their own students are missing out,” data centre and facility services manager Will Belcher told the Cloud & DC Edge Summit.

    “They want to try and use the data from the wireless access points and controllers – we track
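    As a rough sketch, the prioritisation Belcher describes amounts to a lookup from a student's enrolled faculty and the library they have connected in, down to a bandwidth tier. The faculty names and tier labels below are hypothetical, not the university's actual scheme.

```python
# Illustrative policy sketch: full wi-fi access in your own faculty's library,
# limited access elsewhere. Library-to-faculty mapping is invented for the example.
LIBRARY_FACULTY = {
    "law-library": "law",
    "engineering-library": "engineering",
}

def wifi_tier(student_faculty: str, library: str) -> str:
    """Return the access tier for a student joining wi-fi in a given space."""
    home = LIBRARY_FACULTY.get(library)
    if home is None or student_faculty == home:
        return "full"        # own faculty's library, or an unrestricted space
    return "limited"         # capacity prioritised to the faculty's own students

print(wifi_tier("law", "law-library"))          # full
print(wifi_tier("arts", "law-library"))         # limited
print(wifi_tier("arts", "general-study-hall"))  # full (unmapped space)
```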

    View the Original article

  • jkabtech 8:17 pm on September 29, 2017 Permalink |

    Huawei replaces Australian CEO 

    Says strategy won’t change.

    Huawei Australia will replace its CEO James (Xichu) Zhao with immediate effect as part of a global policy to rotate its in-country executives.

    The vendor said Zhao would be replaced by Aragon (Xiaojie) Meng, a 12-year Huawei employee who has previously held high-level roles in Europe and Pakistan.

    Meng indicated the vendor’s strategic direction would remain the same despite the leadership change.

    “We will persist on our 4.5G and 5G journey and keep delivering innovative NB-IoT solutions for the energy, utilities, transportation and agriculture industries,” Meng said in a statement.

    “We will help the local telecommunication market to become more efficient, cost effective and create innovative solutions for telecom operators.”

    Meng also indicated a focus on building consumer brand awareness as the company continues to push for greater market share of its flagship handsets.

    Zhao had been Australian CEO for three years. 

    View the Original article

  • jkabtech 12:17 pm on September 28, 2017 Permalink |

    Riverbed buys Xirrus to reach the network edge 

    Strategic play at future of networking.

    Network management vendor Riverbed Technology will take over wireless provider Xirrus in order to be able to offer solutions spanning the full spectrum of a customer’s network.

    The acquisition will add an enterprise-grade wi-fi product to Riverbed’s SteelConnect SD-WAN offering and extend it to the wireless network edge.

    Riverbed chief Jerry M Kennelly said the enterprise network was becoming more difficult for IT departments to manage given the rise of “digital, cloud, and mobile”.

    “A fundamental rethink to networking is required and with this acquisition, Riverbed and our partners are uniquely positioned to provide CIOs and businesses with a software-defined networking approach that delivers unified connectivity and orchestration across the entire network,” he said in a statement.

    Riverbed sees being able to offer enterprises the full range of network solutions as a strategic play aimed at what it sees as a move away from configuring boxes and towards “policy, orchestration and automation”.

    “By combining the advanced wi-fi capabilities of Xirrus and SteelConnect’s intuitive and powerful orchestration, we’re taking a bold step to bring the power of policy-based network management out to the wireless edge,” Paul O’Farrell, senior vice president of the Riverbed SteelConnect, SteelHead and SteelFusion business unit, said in a statement.

    The company did not disclose financial terms of the deal. It is expected to close in April.

    Riverbed said Xirrus would continue to be available as a standalone WLAN solution.

    Its acquisition of Xirrus is the latest in a string of networking provider takeovers consolidating the sector.

    HPE acquired Aruba Networks in 2015 for A$3.5 billion, Fortinet took over Meru Networks in the same year, and Brocade bought Ruckus Wireless last year for A$2 billion only to sell it off in February to Arris.

    View the Original article

  • jkabtech 4:17 am on September 28, 2017 Permalink |

    Fibre, wi-fi keep most Shell Prelude staff onshore 

    Flying workers out to floating platform an “exception” rather than rule.

    Implementing a subsea fibre optic link to underpin automation and remote monitoring aboard Shell Australia’s forthcoming Prelude floating liquefied natural gas (FLNG) platform means it will only need to fly people out to the plant “by exception”.

    The energy giant joined fellow LNG operator Inpex in 2014 as a foundation customer for a then-new subsea cable between Darwin and Port Hedland.

    The 2000km, $100 million cable was built by Alcatel-Lucent and managed by Nextgen Networks, which is now part of Vocus.

    The Prelude project is being closely watched by the LNG sector as a potential model for future gas extraction from increasingly remote offshore fields.

    Speaking at this month’s Gastech conference in Tokyo, Shell Australia’s vice president of production, David Bird, said Prelude’s systems would mean flying staff out to the remote floating platform “only by exception”.

    The enormous Prelude platform is being equipped with process control, monitoring and automation technology by Emerson. It is also fitted with communications and entertainment systems by Alcatel-Lucent, and both of these are backed by the subsea cable.

    Prelude’s operations will be monitored remotely from Shell Australia’s collaborative work environment (CWE) in Perth, which acts as the company’s main operations centre.

    Shell has set up other CWEs worldwide which provide similar remote support for offshore operations.

    “It’s incumbent on all of us that we minimise

    View the Original article

  • jkabtech 8:17 pm on September 27, 2017 Permalink |

    NBN Co hits 1Gbps+ speeds in fixed wireless trials 

    Expands footprint by 100,000.

    NBN Co has demonstrated that it can push peak broadband download speeds on its fixed wireless network from 50Mbps to over 1Gbps.

    The company has been conducting a trial of the technology at a site in Ballarat; the same location from which it announced plans to accelerate peak speeds on its fixed wireless network from 50Mbps to 100Mbps.

    NBN Co achieved maximum peak speeds of 1.1Gbps/165Mbps in the trial. It also demonstrated that intermediate speeds of 400/55Mbps and 250/50Mbps were possible.

    The 1Gbps

    View the Original article

  • jkabtech 12:17 pm on September 27, 2017 Permalink |

    Sky TV NZ creates hybrid OpenStack-AWS environment 

    Uses sprints to bed down OpenStack.

    Sky Network Television NZ has set up a hybrid cloud environment using AWS and OpenStack that has cut the time taken to deploy new projects and make changes to its products.

    Created initially with the TV provider’s coders in mind, the environment can also be used by other business functions, which can spin up instances via a self-service portal.

    The project was initiated when developers were asked to create an online video player during last year’s Rio Olympics.

    Though Sky had an existing on-demand player called SkyGo, it was backed by on-premises physical and virtual hardware that the team believed would neither scale nor meet the required performance benchmarks, so the player was re-architected to run in the cloud instead.

    “We ended up going into AWS initially which turned out pretty well because the way we approached it was well-architected and there was buy-in from all the different stakeholders – including security and networks – right at the start,” infrastructure architect Jean-Pierre Senekal told the OpenStack Summit in Boston.

    “The way we did it was to template everything and automate it.”
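    The "template everything and automate it" approach can be illustrated with a minimal parameterised template that renders the same definition for any project and environment. The template body and parameter names here are invented for illustration; real deployments would use something like CloudFormation or OpenStack Heat templates.

```python
from string import Template

# Minimal sketch of the "template everything" approach: one parameterised
# instance definition, rendered identically for any project/environment pair.
INSTANCE_TEMPLATE = Template(
    "name: ${project}-${env}-app\n"
    "flavor: ${flavor}\n"
    "network: ${project}-${env}-net\n"
)

def render(project: str, env: str, flavor: str = "m1.small") -> str:
    """Render an instance definition from the shared template."""
    return INSTANCE_TEMPLATE.substitute(project=project, env=env, flavor=flavor)

print(render("videoplayer", "prod", flavor="m1.large"))
```

    Because every environment is rendered from the same template, a change made once propagates everywhere, which is what cuts deployment time in practice.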

    Senekal said Sky then decided it wanted to “transfer that capability that we had in the

    View the Original article

  • jkabtech 4:17 am on September 27, 2017 Permalink |

    Google opens cloud platform in Sydney 

    Long-awaited launch.

    Google has opened the new Sydney region for its cloud platform to customers, nine months after revealing plans to host infrastructure in Australia.

    The web giant today launched three local zones for high availability – as planned – with services spanning compute, storage, networking and big data.

    PwC Australia, Monash University and Woodside Energy were among enterprise customers that welcomed the arrival of the local presence.

    Woodside revealed at the end of last month that it was working with Google to push the limits of the cloud service as it looks to improve the speed at which it can derive insights from seismic data.

    It is unclear whether it is using the local availability zone; the company today simply said Google cloud platform remained an “on-demand solution” for supercomputing resources.

    PwC Australia’s backing of the local arrival of Google cloud platform points to a potential expansion of the public cloud services used in its “cloud-only” IT infrastructure.

    iTnews revealed in March that PwC had adopted Google’s G-suite for workforce productivity, but the public cloud portion of its outsourced infrastructure was “mostly Amazon and a couple of small workloads in Azure”.

    “The regional expansion of Google cloud platform to Australia will help enable PwC’s rapidly growing need to experiment and innovate and will further extend our work with Google cloud,” PwC Australia CIO Hilda Clune said in a statement.

    Google said it had “thousands” of customers in Australia already using some of its cloud services.

    Since revealing plans in September last year to host its cloud platform in Sydney, the web giant has remained tight-lipped on when it would launch, and with how many services.

    Google said tests had already confirmed significant latency improvements for existing cloud customers that had been hosting workloads in other availability zones and regions, such as Asia or the United States.

    It promised further improvements once the Indigo cable system connecting Perth and Singapore is live in 2019. Google is part of a consortium of companies that has taken over the cable project.

    For now, traffic runs in and out via several east coast paths to Asia and beyond.

    Google said it had appointed Shine Solutions, Servian, 3WKS, Axalon, Onigroup, PwC, Deloitte, Glintech, Fronde and Megaport as certified GCP partners.

    View the Original article

  • jkabtech 8:17 pm on September 26, 2017 Permalink |

    Equinix to fit out remaining part of SY4 data centre 

    Stumps up $55 million.

    Equinix will invest US$42 million (A$54.7 million) into the phase two expansion of its SY4 data centre in Sydney’s south.

    SY4, which is located in the suburb of Alexandria, was announced back in April 2015. The build-out was split into two stages, each with a capacity of 1500 cabinets.

    The company officially brought the first 1500 cabinets online in August last year when it opened the facility, at a cost of US$97 million.

    Now it has announced plans to deploy the remaining 1500 cabinets in the space, bringing the total capacity up to that which it had initially planned for – 3000 cabinets and around 12,500 square metres of usable floor space.

    The operator said demand for cloud services continued to be a driver for expansion.

    Equinix – like others in the space – has established itself as a major point of interconnection between third-party clouds and an enterprise’s own systems.

    The company’s Australian managing director Jeremy Deutsch said it supported “more than 100 local and multinational companies in SY4.”

    He said the commitment to fit out the remaining portion of the building would enable more organisations to gain “direct, private access to the leading cloud providers, as well as many specific cloud services” as part of hybrid cloud strategies.

    The investment comes as Equinix continues to dominate data centre rankings published by Cloudscene, a data centre directory business owned by entrepreneur Bevan Slattery.

    Equinix was ranked as the top operator across all four geographies tracked by Cloudscene, which covers North America, EMEA, Asia and Oceania, the latter of which encompasses Australia.

    View the Original article

  • jkabtech 12:17 pm on September 26, 2017 Permalink |

    Govt to review Australian space sector 

    Meet the academic and industry leaders overseeing the process.

    The government is set to review Australia’s space industry with a view to creating a strategy to support its growth over the course of the next decade.

    The capability review will be led by an expert review group chaired by former CSIRO chief executive Dr Megan Clark.

    Also advising the review will be:

    • UNSW Canberra’s chair for space engineering, Professor Russell Boyce, who is spearheading a $10m push to “fly affordable, responsible in-orbit missions” using cubesats to test and develop “innovative new technologies for spacecraft”. The uni went to market for seven engineers earlier this year and claims to have “the largest space capability in Australia”.

    • Michael Davis, who chairs the Space Industry Association of Australia (SIAA). His organisation expressed disappointment earlier this year when the federal budget saw no money allocated to a civil space program for Australia – something the SIAA has been pressing for, particularly as Australia grows its credentials as a hub for cubesat development. “The SIAA appreciates that its proposals constitute a rethinking of the governmental structures required for the administration and oversight of a permanent national space program,” the SIAA said. “We will continue to advocate for the establishment of an internationally recognised national space agency as a fundamental first step in a strategy to build on our scientific and industrial capabilities.”

    • Dr David Williams, presently a CSIRO director with executive oversight of areas including astronomy and space science, the Australian Telescope National Facility, and Data61. He was previously chief executive of the United Kingdom Space Agency.

    • Dr Stuart Minchin, who heads environmental geoscience at Geoscience Australia and has a strong interest in Earth observation and monitoring.

    • Professor Steven Freeland, dean of the school of law at Western Sydney University, who has a strong background in space law and policy development.

    • Professor Anna Moore, director of ANU’s advanced instrumentation and technology centre (AITC). She has built major instruments used in observatories worldwide, including in Australia, Japan and the United States.

    • Dr Jason Held, director and founder of Saber Astronautics, which has operations in Sydney. He was previously a US Army major and Army space support team leader for USSTRATCOM (formerly Space Command), and has worked on major projects including the Hubble space telescope. He is also the founder of the Delta-V space accelerator, which supports aerospace start-ups in Sydney.

    • Flavia Tata Nardini, co-founder and CEO of Adelaide’s Fleet Space Technologies. Last month she wrote an open letter to the government asking for a “dedicated Australian space agency”. She began her career at the European Space Agency as a propulsion test engineer.

    The review will begin this month and is expected to be completed by the end of March 2018. It is expected to consult with “key stakeholders and state jurisdictions”, among others.

    Minister for Industry, Innovation and Science, Arthur Sinodinos, said the review “will lead to a national strategy for the space sector that reflects both our developing strengths and national interests over the next decade”.

    “Ensuring that the right strategic framework is in place to support the growth of Australia’s space industry will be core to the review process,” he said in a statement.

    “The Australian government wants to ensure the right framework and mix of incentives are in place to assist Australia’s growing space industry sector to participate successfully in this global market.”

    View the Original article

  • jkabtech 4:17 am on September 26, 2017 Permalink |

    NextDC looks to take back data centre building ownership 

    Launches takeover for real estate trust.

    NextDC has launched a bid to buy back the property group it set up in late 2012 to own the land and buildings used for its first three data centres.

    The company has been trying to fend off an investment firm’s efforts to take over Asia Pacific Data Centre Group, whose sole assets are buildings housing NextDC’s Sydney1, Melbourne1 and Perth1 data centres.

    NextDC created APDC in December 2012 and listed the $207 million property trust early in 2013. The data centre operator then took long-term leases on the buildings, ensuring APDC could promise investors an “attractive yield” from a “single landlord and tenant arrangement”.

    APDC was essentially an exercise in capital recycling for NextDC, a way of raising money that could be re-invested in data centre fitout and expansion. NextDC sold down its majority stake in APDC later in 2013.

    In May this year, 360 Capital Group took a 19.8 percent stake in APDC at a cost of $35.8 million, making it the largest single security holder. It has since made a conditional offer to buy out the rest of the company.

    NextDC countered last week by taking a 14.1 percent stake back in APDC for $29 million, and has now launched its own conditional offer to take over the data centre trust.

    The company said it would offer $1.85 per security, funded entirely from its cash reserves.

    “In 2015, we advised the market of our change in strategy to own more of our data centre properties over the longer term when we announced that NextDC would proceed to develop and own the new data centres for Brisbane (B2) and Melbourne (M2),” NextDC CEO Craig Scroggie said in a market filing.

    Scroggie said that taking over APDC – and therefore the ownership of its earlier data centres – represented a “low risk acquisition” for the company.

    The battle over APDC has become increasingly hostile as NextDC and 360 Capital Group traded accusations.

    Financial analysts reacted positively to news that NextDC would launch a bid for APDC.

    View the Original article

  • jkabtech 8:17 pm on September 25, 2017 Permalink |

    Cisco deletes Meraki customer data in config bungle 

    Unsure what info is lost.

    Network equipment giant Cisco has owned up to an embarrassing blunder on its Meraki management platform that led to customer data being deleted from the service’s cloud storage.

    Cisco said its Meraki engineers made a configuration change error on the North American object storage service.

    “On August 3rd, 2017, our engineering team made a configuration change that applied an erroneous policy to our North American object storage service and caused certain data uploaded prior to 11:20AM Pacific time on August 3 to be deleted,” Cisco said.

    Lost data includes customer Meraki dashboard custom splash page themes and organisation logos, floor plans, and device placement photos.

    Custom enterprise apps in the Cisco Meraki System Manager have also been deleted, as have user voice menus, music on hold, contact images and voice mail greetings.

    Customers are advised to wait while Cisco engineers work to rectify the error before uploading new data to the Meraki service.

    The company said it is working out what tools it can build to help customers identify the data they have lost from the Cisco Meraki cloud.
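    The mechanics of the incident, a policy that deletes every object uploaded before a cutoff timestamp, can be sketched as follows. The object names are hypothetical; only the cutoff time comes from Cisco's statement.

```python
from datetime import datetime

def apply_policy(objects, cutoff):
    """Partition {name: uploaded_at} into (kept, deleted) around a cutoff.

    Illustrative sketch of a cutoff-based retention rule; misapplied, such a
    rule removes everything uploaded before the cutoff, as in the incident."""
    kept, deleted = {}, {}
    for name, uploaded in objects.items():
        (deleted if uploaded < cutoff else kept)[name] = uploaded
    return kept, deleted

# 11:20am on August 3, 2017 (Pacific time in Cisco's statement; naive here)
cutoff = datetime(2017, 8, 3, 11, 20)
objects = {
    "logo.png": datetime(2017, 7, 1, 9, 0),        # hypothetical example objects
    "floorplan.pdf": datetime(2017, 8, 3, 10, 59),
    "new-upload.jpg": datetime(2017, 8, 3, 12, 5),
}
kept, deleted = apply_policy(objects, cutoff)
print(sorted(deleted))  # ['floorplan.pdf', 'logo.png']
```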

    Cisco’s Meraki service is a cloud-based management platform for networking devices, security cameras, mobility and security appliances.

    The Westfield chain of shopping malls and Accor Hotels are among Cisco Meraki customers.

    View the Original article

  • jkabtech 12:17 pm on September 25, 2017 Permalink |
    Tags: Canberra

    Microsoft to launch Azure in Canberra in push for sensitive govt data 

    Greg Boorer (Canberra Data Centres) and James Kavanagh (Microsoft).

    Hyperscale data centres hoping for ASD approval.

    Microsoft is launching two Canberra regions for its Azure platform in a major investment to convince Australian government agencies to move their data to its public cloud.

    The company will go live with the new regions, based in Canberra Data Centres’ Hume and Fyshwick facilities, in the first half of 2018. These regions will join Microsoft’s existing Australian public cloud data centres in Sydney and Melbourne, taking the total number of global Azure regions to 42.

    The move also steals a march on Azure’s biggest competitor, Amazon Web Services, which does not have any data centres in Canberra, though recently announced Direct Connect through NextDC.

    The two Azure regions – dubbed Australian Central 1 and 2 – will be plumbed into the ICON network that connects all Australian government agencies.

    “Less than three years ago, we launched our first cloud services from an Australian data centre. Since then, we’ve worked to lead the way in delivering trusted innovation to our customers and partners, which has at times meant the hard work of undertaking very onerous compliance processes,” James Kavanagh, Microsoft Azure engineering lead for Australia, said.

    “We’ve taken on that effort to reduce the work our partners and customers must do themselves, thus accelerating their ability to adopt innovation.

    “This has led the Australian Signals Directorate to certify a total of 52 services across Microsoft Azure, Microsoft Office 365 and Microsoft Dynamics 365 – far more than all other cloud services combined.”

    Under rules managed by the ASD, government data is classified into four levels: unclassified, protected, secret and top secret.

    Only two cloud providers are currently certified to handle protected-level data: Vault Systems and Sliced Tech.

    Microsoft has been awarded unclassified certification for specific Azure and Office 365 services, and is now actively seeking protected status – which, despite the company’s bullish language and significant investment in Canberra, is not guaranteed and could still be some way off.

    Some 40 Microsoft cloud services have been audited by IRAP assessor Shearwater Solutions, with the assessor recommending 25 of these for protected certification, and the other 15 requiring further work.

    “We’re still working to finalise this certification process with Australian Signals Directorate and for clarity it is important to be aware that Microsoft Azure is not certified at protected level by ASD. We still have work to do, but the pathway is understood,” Kavanagh said.

    Classified information

    Microsoft hopes to eventually be able to offer government agencies an option for all four levels of classified data: unclassified and protected on the Azure public cloud, and secret and top secret on Azure Stack.

    Azure Stack, which became generally available in July, offers a public cloud-like experience using on-premises hardware from the likes of Dell EMC, Hewlett-Packard Enterprise and Lenovo.
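    The classification-to-platform mapping Microsoft is described as targeting can be restated as a small lookup. The function and its names are illustrative only; as the article notes, protected-level certification for Azure had not been granted at the time of writing.

```python
# Target hosting option per ASD classification level, as described in the
# article. The dict restates the reported plan; the lookup function is a
# hypothetical illustration, not any real compliance tool.
TARGET_PLATFORM = {
    "unclassified": "Azure public cloud",
    "protected": "Azure public cloud",     # certification still being sought
    "secret": "Azure Stack (on-premises)",
    "top secret": "Azure Stack (on-premises)",
}

def hosting_option(level: str) -> str:
    """Return the platform Microsoft is targeting for a classification level."""
    try:
        return TARGET_PLATFORM[level.lower()]
    except KeyError:
        raise ValueError(f"unknown classification: {level}")

print(hosting_option("Protected"))   # Azure public cloud
print(hosting_option("secret"))      # Azure Stack (on-premises)
```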

    James Turner, an analyst from IBRS who was briefed on Microsoft’s plans, said it would be welcome news for public sector IT teams as they tried to align their IT positions with the ASD’s Information Security Manual.

    “Government agencies have the ISM as their external risk compass, but many agencies struggle with two key aspects of ISM compliance. The first is to actually achieve ISM compliance in the challenging areas where things get complicated, and then the second is to maintain compliance in these areas – because they’re complicated,” Turner said.

    “It’s going to be compelling for many CIOs that a vendor like Microsoft, that really gets the enterprise, steps up and says that it’s done the heavy

    View the Original article

  • jkabtech 4:17 am on September 25, 2017 Permalink |
    Tags: Arctic, Circle, largest

    World’s largest data centre to be built in Arctic Circle 

    Start-up will construct the facility in northern Norway.

    Start-up Kolos is planning to build a record-breaking data centre in the town of Ballangen in northern Norway.

    The data centre will use chilled air and local hydropower to help keep energy costs down, as first reported by the BBC.

    Kolos recently raised funding in a series A round involving Norwegian investors, said to be worth “several million dollars”, and is working with a US investment bank to secure the remaining funds.

    Kolos claims that within a decade the facility will house enough servers to draw 1000MW, which would make it the largest data centre in the world by power. Initially, it will draw only 70MW.

    The centre is expected to cover 600,000 sqm and will be a four-storey building. The current record holder measures around 585,000 sqm and is based in Langfang in China.

    There is also an advanced data centre being built in Nevada by Switch. Dubbed ‘the Citadel’, the facility will measure around 669,000 sqm when finished. The Nevada data centre will use 100 percent renewable energy but will need only up to 650MW.
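    The quoted figures make the competing "largest" claims easy to compare: by floor area the planned Switch Citadel would exceed Kolos, while Kolos's claim rests on design power. A quick, illustrative calculation (figures as quoted, rounded to whole sqm):

```python
# Floor area and design power for the three facilities mentioned, plus an
# implied power density (MW per 100,000 sqm) as a rough comparison only.
facilities = {
    "Kolos (Ballangen)": {"sqm": 600_000, "mw": 1000},
    "Langfang (China)": {"sqm": 585_289, "mw": None},     # power not quoted
    "Switch Citadel (Nevada)": {"sqm": 668_902, "mw": 650},
}

for name, f in facilities.items():
    density = f["mw"] / f["sqm"] * 100_000 if f["mw"] else None
    note = f"{density:.0f} MW per 100k sqm" if density else "power not stated"
    print(f"{name}: {f['sqm']:,} sqm, {note}")
```

    Kolos works out to roughly 167 MW per 100,000 sqm against the Citadel's roughly 97, which is why Kolos leads on power despite the smaller footprint.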

    In March, Kolos signed a contract of purchase and presented plans for building the data centre.

    “In northern Norway, we actually have Europe’s cheapest power, which is also 100 percent renewable. In addition, Ofoten and Ballangen have extremely good access to dark fiber, which is a prerequisite for running data centres,” Håvard Lillebo, CEO of Kolos, said.

    There are other advantages, too, such as the region’s cold, stable climate, which will help keep cooling costs low, and a solid power grid. The company also claimed that the nearby river creates a physical barrier to protect the facility.

    Kolos claimed its project will be the largest green data centre in the world, as 100 percent of its energy will come from hydropower and wind. Kolos called Norway the world’s leader in green power, making it the perfect place for the project. It hopes to achieve a 60 percent reduction in energy costs, providing significant onward cost savings for customers.

    The project is backed by five mayors in the area. Kolos estimates the centre will directly create 2000 to 3000 new jobs and indirectly support 10,000 to 15,000 jobs as a result of people moving to Ballangen.

    Clive Longbottom, principal analyst and founder at Quocirca, said it wasn’t completely clear if the data centre was definitely going ahead, but noted that it was a big project involving a lot of power which will benefit from free-air and chilled water cooling.

    However, he said the location was not the easiest place to get to, and underlined that Kolos’s website does not explicitly say what the data centre will be used for.

    “European companies looking for a native data centre are more likely to go for an EU country – and Sweden has a thriving data centre market, as do plenty of others where power may not be so cheap, but getting there could be easier and cheaper (e.g. Holland),” he said.

    View the Original article

  • jkabtech 8:17 pm on September 24, 2017 Permalink |
    Tags: Thinxtra

    Government buys into IoT network builder Thinxtra 

    Takes $10m equity stake.

    The government has taken a 15 percent equity stake in Thinxtra valued at $10 million to fund a nationwide low-power wide area network for internet of things (IoT) devices and applications.

    Minister for the environment and energy Josh Frydenberg said Thinxtra’s network will be focused on helping Australian companies monitor and reduce energy use.

    Other areas covered by the IoT network include water metering, soil monitoring for farming, and logistics.

    Proprietary wireless technology from French vendor Sigfox will be used for the low-power network, which is intended to cover 95 percent of the population by the end of the year.

    The network will integrate with the global Sigfox network. It operates independently of telco IoT networks such as Telstra’s LTE-based network, but connects to them for data backhaul.

    By 2022, the network aims to connect 17 million devices. Thinxtra claims over 150 local businesses have already partnered with the company.
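The low-power trade-off described above has a practical consequence for developers: Sigfox uplink messages carry at most 12 bytes, so readings for use cases like energy and water metering must be packed tightly. A minimal sketch in Python, assuming a hypothetical meter payload layout (the field choices are illustrative, not Thinxtra’s actual format):

```python
import struct

# Sigfox uplink frames carry at most 12 bytes, so sensor readings
# must be packed into a fixed binary layout rather than something
# self-describing like JSON. This hypothetical layout packs device
# status flags (1 byte), battery level in percent (1 byte), and two
# 32-bit counters for energy (Wh) and water (litres).
MAX_PAYLOAD = 12

def pack_reading(status: int, battery_pct: int,
                 energy_wh: int, water_l: int) -> bytes:
    # ">BBII" = big-endian: two unsigned bytes, two unsigned 32-bit ints
    payload = struct.pack(">BBII", status, battery_pct, energy_wh, water_l)
    assert len(payload) <= MAX_PAYLOAD
    return payload

frame = pack_reading(status=0x01, battery_pct=87,
                     energy_wh=123456, water_l=789)
print(frame.hex())  # 10 bytes, inside the 12-byte limit
```

Giving every field a fixed width up front keeps the frame decodable on the receiving side without any per-message metadata, which is what makes the format fit in so little space.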

    The funding is made available through the Clean Energy Finance Corporation (CEFC), which invests in emissions reduction initiatives. The money forms part of Thinxtra’s $20 million capital raising drive.

    Through the investment, the CEFC and the government will own 15 percent of Thinxtra.

    Thinxtra is headquartered in Alexandria, NSW and has offices in Auckland and Hong Kong. It is backed by New Zealand technology company Rakon, which makes quartz oscillators and other frequency control devices for use in global positioning systems and guided missiles. Rakon owns 42 percent of Thinxtra.

    View the Original article

  • jkabtech 12:17 pm on September 24, 2017 Permalink |
    Tags: , , savings, shift, Westpac   

    Westpac sees big early savings from cloud shift 

    70 percent fall in infrastructure costs.

    Westpac has seen a two-thirds drop in its infrastructure costs with the first lot of application migrations to its new private cloud.

    The bank kicked off its ‘Agilus’ infrastructure transformation in late 2014. The initiative involves a large-scale shift of between 60 and 70 percent of Westpac’s applications into private and public cloud environments.

    Private cloud will be used for applications that sit close to Westpac’s core, while commodity-based services closer to the customer will be housed on public cloud.

    Infrastructure costs for 30 “projects” – anything from an entire application to smaller features of one – that Westpac has already migrated to its private cloud set-up have fallen by as much as 70 percent, CIO Dave Curran told iTnews.

    “I’m seeing the infrastructure delivery costs cut to a bit less than a third, for example what used to cost me a million now costs me $300,000,” he said.

    There are a further 50 “projects” waiting to be migrated over the next 12 months. Any new feature that is being built is to go onto either the private or public cloud environment.

    Rather than rearchitect existing applications to run in a cloud environment, Curran’s strategy has been to replace as many homegrown or customised customer-facing systems as possible with off-the-shelf software.

    The big product systems that sit next to the core on-premise, however, are likely to stay that way. Smaller versions of these systems will be containerised where appropriate.

    Curran says this business-centric approach is focused on the outcome that a cloud migration of a particular application would have on Westpac’s customers, shareholders, staff, and the regulator.

    “I’ve got a really simple view. The closer you get to your customer, the more I’m happy to be bespoke and differentiate. The further I get, the more I want efficiency, therefore I want standardisation and all the things that go with that,” Curran said.

    “For product systems, I want standard – so for credit cards I’m on VisionPlus, the customer service hub I’m on Oracle, and what have you.

    “Where we want to differentiate is where we are touching the customer. That should look and feel different based on what Westpac is, what St George is, what our relationship with the customer is.”

    Curran has previously said he expects that once private cloud becomes the norm, there will be a “second set of conversations” around public cloud that will see Westpac’s entire infrastructure moved to a public cloud set-up within ten years.

    Counts down days to first go-live for customer service hub

    The bank is also just months away from the first release under its customer service hub transformation.

    Last December Westpac signed with Oracle to adopt its customer master as the foundation for an integration platform that will link up all the data in product and customer-facing systems to provide a single, holistic view of the customer.

    It will involve an entire rebuild of Westpac’s operations from the middle out.

    The first piece of work on the customer service hub is home ownership, a significant chunk of Westpac’s business and a vertical that touches at least two-thirds of Westpac’s retail and commercial bank systems.

    Westpac had previously said this first release would go live before the end of the year; Curran now says it is ahead of schedule, “unusually for these

    View the Original article

  • jkabtech 4:17 am on September 24, 2017 Permalink |
    Tags: Analysing, , , Songs   

    Analysing 3D Printer Songs For Hacks 

    3D printers have become indispensable in industry sectors such as biomedical and manufacturing, and are deployed in what are termed 3D print farms. They help reduce production costs as well as time to market. However, a hacker with access to these manufacturing banks can introduce defects such as microfractures and holes intended to compromise the quality of the printed component.

    Researchers at Rutgers University-New Brunswick and the Georgia Institute of Technology have published a study on cyber-physical attacks and their detection techniques. By monitoring the movement of the extruder with sensors, capturing the sounds made by the printer with microphones, and using structural imaging, they were able to audit the printing process.
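The researchers’ detection pipeline is more sophisticated, but the core side-channel idea can be sketched simply: record a reference trace (audio energy or extruder position samples) from a known-good print, then flag later runs whose trace correlates poorly with it. The traces and threshold below are illustrative assumptions, not the study’s actual data:

```python
import math

def correlation(a, b):
    """Pearson correlation between two equal-length sample traces."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    num = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    den = math.sqrt(sum((x - mean_a) ** 2 for x in a) *
                    sum((y - mean_b) ** 2 for y in b))
    return num / den

def audit(reference, observed, threshold=0.9):
    """Pass the print only if it tracks the known-good reference closely."""
    return correlation(reference, observed) >= threshold

# Hypothetical extruder-motion traces: a golden reference, a normal
# run with sensor noise, and a run whose toolpath was tampered with.
golden   = [0.0, 1.0, 2.0, 1.0, 0.0, 1.0, 2.0, 1.0]
clean    = [0.1, 1.1, 1.9, 1.0, 0.0, 0.9, 2.1, 1.1]
tampered = [0.0, 1.0, 0.2, 1.8, 0.0, 1.0, 0.3, 1.9]

print(audit(golden, clean), audit(golden, tampered))
```

A real system would align the traces in time and combine several side channels (audio, vibration, power draw) before deciding, but the pass/fail comparison against a golden run is the essence of the technique.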

    A lot of studies have popped up in the last year or so including papers discussing remote data exfiltration on Makerbots that talk about the type of defects introduced. In a paper by

    View the Original article


  • jkabtech 12:17 pm on September 23, 2017 Permalink |
    Tags: Altitude, Balloon, , , , ,   

    Don’t Miss Watching this Solar Eclipse High Altitude Balloon Online 

    A reader let us know about an exciting project that he and his team are working on at the Solid State Depot Makerspace in Boulder: the Solar Eclipse High Altitude Balloon. Weighing in at 1 kg and bristling with a variety of cameras, the balloon aims to catch whatever images can be had during the solar eclipse. The balloon’s position should be trackable on the web during its flight, and some downloaded images should be available as well. Links for all of that are available from the project’s page.

    High altitude balloons are getting more common as a platform for gathering data and doing experiments; an embedded data recorder for balloons was even an entry for the 2016 Hackaday Prize.

    If all goes well and the balloon is able to be recovered, better images and video will follow. If not, then at least a post-mortem of what the team thinks went wrong will be posted. Launch time is approximately 10:40 am Mountain Time (UTC -07:00) on Aug 21 2017, so set your alarm!

    View the Original article

  • jkabtech 4:17 am on September 23, 2017 Permalink |
    Tags: , , Procrastination, Visualizing   

    Combat Procrastination by Visualizing Your Future Self 


    I’m typically the queen of procrastination. If something doesn’t have to be done until next week, then I’m more than likely not going to start it until the day before it needs to get done, regardless of whether or not I have plenty of time to complete it between now and then. As I’ve gotten older I’ve gotten better with the whole procrastination thing, but it’s still a problem. Now researchers think they’ve found a solution: visualizing your successful future self.

    The idea is that everyone kind of sucks at thinking about the future, but if you think about how what you do now will affect you in the future, you’re more likely to make a smart decision that will positively benefit yourself down the line.


    For instance, if I imagine myself giving a killer presentation at work because I prepared a week in advance and had the time to fine-tune what I was going to say, I might be inclined to do that rather than setting myself up to fumble through a presentation I’m ill prepared for because I didn’t start planning until the night before after a few Happy Hour cocktails.

    Hal Hershfield, a professor of marketing at UCLA’s Anderson School of Management and one of the people behind the idea, studies how our perception of time can determine how we make decisions. In his experiments, he had people interact with their future selves via virtual reality. People who interacted with their future selves were much more likely to put money in a (fake, experiment-based) retirement account and be concerned about the future version of themselves as well as the one in the present.

    The BBC detailed his work this week, as well as the work of several other researchers performing similar studies. The biggest takeaway from everyone is that considering your future self when you make decisions, specifically how your current decisions will impact your life in the future, can help combat procrastination and get you back to work sooner.


    Visualize what the task will look like completed, and the tasks you need to perform to get you to that finish line. If you’re still not inclined to start, think about which one of those tasks is holding you up.

    For me, when I write longer reported stories I tend to put them off because I absolutely loathe transcribing interviews. I love doing the interview and can write a wonderful story really quickly once I get that transcription done, but the need to transcribe a 20-minute interview will make me put off writing a story for weeks.

    Once I discovered that, I found a few people who transcribe for a living (and seem to enjoy it) and started hiring them to do it for me. I removed a speed bump that was dramatically slowing me down and made myself tremendously more efficient in the process.

    So, before putting off that next big task, think about future you. You’ll thank yourself later.


    View the Original article
