How to rectify Error Code 70: Permission Denied

kopsy

New Member
I'm getting an error saying
Error Code 70: Permission Denied
The debugger highlights the line Print #intfhbig, stroutbig.
This error is also peculiar to me because the code ran fine while I was on Office 2007. I recently migrated to Office 365 to take advantage of the newly available formulas and 64-bit. In the beginning the code seemed to work fine, but on continuous runs it throws up an error popup, interrupting the work, roughly five times an hour. Could somebody help me in this regard? The code is below, please check. Thank you.

Code:
Option Explicit

Dim runtimerbig As Date

Public Sub bigtoNotepad()
    Const timerAmount2019 As String = "00:01:00"

    ' Re-schedule this routine to run again in one minute
    runtimerbig = Now + TimeValue(timerAmount2019)
    Application.ScreenUpdating = False
    Application.OnTime runtimerbig, "bigtoNotepad"

    Dim sfilenamebig As String
    Dim intfhbig As Integer
    Dim ocellbig As Range
    Dim stroutbig As String

    sfilenamebig = "D:\bigcollection.txt"

    Close                           ' close any file handles left open
    intfhbig = FreeFile()
    Open sfilenamebig For Output As intfhbig

    ' Write each row (10 columns) as one tab-delimited line
    For Each ocellbig In Worksheets("NotepadData").Range("A1:A621")
        stroutbig = Join(Application.Transpose(Application.Transpose(ocellbig.Resize(1, 10).Value)), vbTab)
        Print #intfhbig, stroutbig
    Next ocellbig

    Close intfhbig
    Application.ScreenUpdating = True
End Sub
 
Try...
Code:
Open sfilenamebig For Output As #intfhbig
Dim stroutbig As String
For Each ocellbig In Worksheets("NotepadData").Range("A1:A621")
    stroutbig = Join(Application.Transpose(Application.Transpose(ocellbig.Resize(1, 10).Value)), vbTab)
    Print #intfhbig, stroutbig
Next ocellbig
Close #intfhbig
 
Thanks for responding.
As you suggested, I have prefixed the hash symbol in those two places, and I've been testing by running it for almost 4 hours now; the error appeared 2 times in that period. But I would like to test it for some more hours, say running it for a full day, before coming to a conclusion. For now the error seems to have become less frequent. Will write again after 24 hours. Thank you :)
 
Hi, it didn't help; the errors keep appearing. I was testing and running it continuously with close observation. Initially, for a few hours, I thought the error was rectified, but after a prolonged period Error Code 70 popped up again and again. What could be the cause, and is there a solution for this?
 
That error is usually generated when you are trying to use an object that's already held by some process.

Can't say for sure what's causing it, but it's likely due to your use of Application.OnTime and the process never being closed.

Personally, I don't like using Application.OnTime and prefer Task Scheduler along with VBS or some other means to run code on a schedule.

However, you may have luck if you separate Application.OnTime out into a separate module and call your sub from there, making sure to clean up variables at each execution. I have a sneaking suspicion that your sub never really terminates and ends up causing the issue, because Application.OnTime calls the sub from within its own process.
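As a rough sketch of that separation (the sub names ScheduleExport and StopExport are just illustrative; bigtoNotepad would then drop its own OnTime call):
Code:
Option Explicit

Dim runtimerbig As Date

' Scheduler kept in its own module: it only re-arms the timer and calls the worker.
Public Sub ScheduleExport()
    runtimerbig = Now + TimeValue("00:01:00")
    Application.OnTime runtimerbig, "ScheduleExport"
    bigtoNotepad                      ' the export sub no longer schedules itself
End Sub

' Call this to cancel the pending timer so the process can actually terminate.
Public Sub StopExport()
    On Error Resume Next              ' ignore the error if no timer is pending
    Application.OnTime runtimerbig, "ScheduleExport", , False
End Sub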
 
Hi,
I came to know where the problem lies.
Yes, the Application.OnTime process now gets stopped by calling it manually from a separate sub.

Now, a little brief about the method I'm using:
I'm using two different Excel versions, MSO365 to load all the queries and Office 2013 for visualization, basically to simplify the workload on the CPU.
Due to the heavy load in MSO365, I'm transferring the data to a .txt file every 1 minute, and then Office 2013 imports the same .txt file every 1 minute. Here comes the error: one is exporting and the other is importing the same file, and when the two collide at some point, Error Code 70 appears. When I tested closing the Office 2013 workbook, the error didn't appear even once, and Office 365 kept exporting the data without any interruption.

Of course I have no answer for how to rectify this error; please suggest if you have any. Thank you :)
 
Why not just load the data using Get & Transform in O365 and then load it directly into the data model?

That skips the need to export to text and also the need to read the text file in another Excel application.

Failing that... maybe one of the following approaches will help.

1. Use ADO code or some other means to import data from the txt file into 2013 without opening the file. I'm not 100% sure, but I believe ADO does not lock up the file.

2. Instead of using Now + 1 min, make O365 fire on the minute and 2013 at the 30-second mark, assuming that each routine never takes more than 30 seconds to complete.
Ex:
To fire code on the minute:
Code:
runtimerbig = Now + TimeSerial(0, 1, -Second(Now))

'Alternately
runtimerbig = Int(24 * 60 * Now + 1) / 24 / 60

To fire code at the 30-second mark every minute:
Code:
SomeVariable = Now + TimeSerial(0, 1, -Second(Now) + 30)

'Alternately
SomeVariable = Int(24 * 60 * Now + 1) / 24 / 60 + 1 / 24 / 60 / 2

There are other ways, but it's a bit hard to give you specifics without knowing the full scope of your code (ex: how do you import data from the txt file into 2013? etc.).
 
I did not want to put much burden on Excel 365 or the CPU, as both version files exceed 25 MB, equally distributed.

The above tricks for firing the codes at separate intervals didn't help either. I assume it could be due to delays in processing a lot of Power Queries.

The ADO approach sounds very interesting if it can access the data without opening the file, but there is very little information available about this topic on the internet. I have enabled the ActiveX Data Objects 6.1 library, but beyond that I have no knowledge of how to import the txt file.

I'm importing the txt file into 2013 via Data -> From Text.
 
Get & Transform, and then loading the data directly into the data model.

This process is efficient, and the file size should not hit the CPU hard at all.

Basically, you can skip loading the data onto a worksheet and create a pivot table from the data in memory.

It's one of the most efficient methods for dealing with large data sets. I routinely deal with 400 MB+ of data using this method.
Do note that speed is more dependent on RAM and other factors, rather than CPU, in most cases.

For the ADO method, you should find plenty of examples in this forum.
See below for a method to retrieve data from a text file using ADO.
https://chandoo.org/forum/threads/ado-connection-csv-issue-with-mixed-data-type.34097/

While that thread is specific to CSV, a TXT file is treated much the same way (use of a Schema.ini file etc.).
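As a rough, untested sketch of that pattern (the destination sheet name and the no-header assumption are illustrative only, and a Schema.ini file in the same folder may be needed to declare the tab delimiter):
Code:
Sub ImportViaADO()
    ' Sketch only: read D:\bigcollection.txt through ADO without opening it in Excel.
    Dim cn As Object, rs As Object

    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=D:\;" & _
            "Extended Properties=""Text;HDR=No;FMT=Delimited"""

    Set rs = CreateObject("ADODB.Recordset")
    rs.Open "SELECT * FROM [bigcollection.txt]", cn

    ' Drop the whole recordset onto a sheet in the 2013 workbook (sheet name is hypothetical)
    Worksheets("ImportData").Range("A1").CopyFromRecordset rs

    rs.Close: cn.Close
    Set rs = Nothing: Set cn = Nothing
End Sub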
 
Hi chihiro! First of all, I would like to thank you for giving me so many tips and so much knowledge. I went through them, but they also raise various doubts in my mind.
1. Doesn't the Data Model have background refresh? If not, won't I be unable to use Excel until it finishes refreshing/loading the data from the web?
2. Is it really necessary to load the data via a pivot table to reduce the CPU burden? I find a pivot table applying its own understanding, removing duplicates, summing and doing calculations on its own, rather than just giving the raw transformed data. Am I missing something here?
3. What happens when I load the data directly to a worksheet from the data model? Will there be any significant change in CPU usage?

It sounds great to me that you are able to handle 400 MB of data in a single workbook. Though I've got a decent configuration with an i7, 16 GB of CAS latency 16 RAM and an SSD, I'm always afraid of the file size going above 50 MB. I'm very curious to know the time taken for your workbook to open, and I'm also interested in knowing your computer configuration.

Coming to the Error Code 70 part: I have found an alternative solution with software called "FreeFileSync" https://freefilesync.org/manual.php?topic=realtimesync
I'm mirroring/syncing the text files between two different folders every 5 seconds with the above software, so that the same location is not being written and read at once. I guess this method will help in many ways and I believe it won't cause the error again, but I'm still interested to know how we could avoid the error dialog box by adding an error-handling method with a skip or retry procedure.

Regards!
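For the skip/retry part, one rough sketch (the retry count and wait time are arbitrary) would be to replace the FreeFile/Open lines in bigtoNotepad with a small retry loop:
Code:
    Dim attempts As Integer
    Dim opened As Boolean

    intfhbig = FreeFile()
    On Error Resume Next
    Do While attempts < 5 And Not opened            ' arbitrary retry count
        attempts = attempts + 1
        Err.Clear
        Open sfilenamebig For Output As #intfhbig
        opened = (Err.Number = 0)
        If Not opened Then Application.Wait Now + TimeSerial(0, 0, 1)   ' wait 1 second before retrying
    Loop
    On Error GoTo 0

    If Not opened Then
        Application.ScreenUpdating = True
        Exit Sub                                    ' skip this run; the next timer tick will try again
    End If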
 
1. I usually turn off background refresh, but there is an option to turn it on (the data should be queried in Power Query/Get & Transform first; that's where the option is). The data model is simply where data is kept in memory, queried through the connection defined in PQ.

2. Raw data is kept in the data model, in memory. PowerPivot is one of the most efficient ways to report on, slice and dice a large data set.

3. It's just a waste of resources, as returning a large amount of data can impact workbook performance. Typically you'd want to display only the processed data for users to consume.

How fast the query is will depend on a few things.

1. The speed of your network connection, if querying from a remote server/web.

2. The speed of your query execution on the server side (ex: when using a native query on the database... for example, a complex join on an un-indexed table may take longer).

3. The amount of transformation performed in the query.

My largest query currently takes about 2 to 3 minutes, querying from a db. But then, that's largely due to the db admin designing a View based off of another View (MSSQL), which is something that should be avoided for performance's sake.

Your PC spec is better than mine, by the way ;) I use an i7-3630QM, 16 GB RAM but no SSD, and it's a laptop that's at least 4 years old.

In my personal opinion, if data must be refreshed every minute, it should be summarized to just the portion that is required to make key observations/decisions. No human is able to read 20 MB+ of data at a glance every minute...
 
Thanks chihiro :), I have gained very useful information from you. I just converted everything to the data model and I can see my CPU burden has improved substantially. I couldn't take advantage of the pivot table, as I wanted to make a huge set of calculations on the whole data for the analysis, so I had no choice other than loading the data directly from the data model. But I feel it's okay for now; slowly I'll try to learn from the various inputs you have given.
Thank you for your wonderful time! :)
 